I write these words from a motel room just outside the Toronto airport. One of the risks of winter travel to Ontario is the sudden snowstorm that paralyzes traffic and grounds all planes. At least I had enough traveler's sense to secure a room near the airport during that gray period when rooms are still available and the airlines won't yet admit they're not going to fly. So I got to watch Mozart's Don Giovanni on the telly from a warm bed last night instead of camping out in a departure gate.
Another risk of winter travel is far worse. The same airline that is hauling me about just lost one of its planes outside Detroit during my brief trip. Twenty-nine of my fellow souls went down with it, ending all concerns about earthly comfort for them. Wing icing is the current leading suspect. So you won't find me complaining about flights canceled due to bad weather. I'll take safety over reliable scheduling any day of the week.
The same day that plane went down, Skytel's paging network went berserk for half an hour, sending bogus messages far and wide. The alleged cause: "a new customer was given a wrong kind of identification number." But we all know better. Only bad software could let a common human error wreak such havoc. So far I've read no reports of any lives lost as a result of that hiccup, or even any significant monetary loss. But I wouldn't want to be in Skytel's shoes today.
My light reading for this trip was Edward Tenner's Why Things Bite Back: Technology and the Revenge of Unintended Consequences (Alfred A. Knopf, 1996). Tenner was a classmate of mine at Princeton and a superb writer even then. He has only grown more thorough and exacting in the intervening years. It's a good read. And it's chock full of cautionary tales for those of us who would make complex systems, be they airlines or computer programs.
Tenner is not anti-technology. Far from it. He is quick to point out the many ways that improved technology (airplanes, computers, or whatever) has improved our lot. We live longer than our forebears, and often in greater comfort with fewer catastrophes. But the second-order effects of our complex systems often pervert those first-order benefits. We live with more chronic illnesses, and often in greater fear of those fewer but more spectacular catastrophes.
I made this trip to visit a major customer and help get some of my software working better with the customer's product. It was a productive visit, but there are still a number of loose ends. When I get home, I have several bugs to track down. I plan to be extra careful about checking my work. Tenner's recurring theme is that complex systems demand continuous vigilance, if we don't want them to bite back.
Who knows? Someone may someday use my software in a program that predicts when wing icing occurs, or in a program that queues up messages for pagers. Your software might suffer a similar success. Think about it.
P.J. Plauger