Dr. Dobb's Journal May 2004
Upon hearing that researchers discovered the world's oldest bug, what came to mind was a register in an assembly-language program that MOV'd where it shouldn't have, or a Lisp application that ended where it didn't begin. Alas, I'd missed the part about this being entomology, not computer science. As it turns out, what University of Kansas professor Michael Engel actually stumbled across was Rhyniognatha, a fossilized insect that's more than 400 million years old. And, as with software bugs, Engel didn't have to root around to find it; the bug had been right there in a drawer at the Museum of Natural History in London for decades.
"After years and years of fieldwork," Engel said, "here it was at a museum. It's an amazing thing." That's the way it is with bugsthey hang around for years, then pop up when you least expect them.
In the computer-science domain, a "bug" is any fault in a program that causes the program to do something different from what was intended. Yes, viruses and worms may go by names such as the "Love Bug," but they really aren't bugs; they maliciously go about doing exactly what they're designed to do.
Bugs are more than a nuisance. An exhaustive 2002 study conducted by the Department of Commerce's National Institute of Standards and Technology (NIST) pegged the annual cost of software bugs at $59.5 billion (http://www.nist.gov/director/prog-ofc/report02-3.pdf). Granted, this is based on 2000 statistics, a year in which total sales of software reached approximately $180 billion, supported by a workforce of 697,000 software engineers and 585,000 programmers. The study also discovered that anywhere from 50 to 75 percent of total development costs went to testing and debugging. Even at that, more than half of all bugs aren't found until the end of the development cycle, or when software is already in users' hands. The study estimates that as much as $22.2 billion per year could be saved by improved testing that identifies and removes bugs earlier.
On the upside, bugs make good headlines. For instance, the bug in the on-board guidance computer of Europe's Ariane 5 rocket led to the destruction of the rocket, costing more than $1 billion. Then there was the infamous priority-inversion bug on the Mars Pathfinder spacecraft (see "A Conversation with Glenn Reeves," DDJ, November 1999). And, of course, the granddaddy of them all: the Year 2000 bug, which really wasn't a bug at all. (The two-digit year compression and its ultimate overflow worked exactly as designers and programmers knew it would; they just didn't think their code would still be in use in 2000.)
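The Y2K "non-bug" is worth a moment's illustration. A minimal sketch (the function name and scenario are invented for illustration, not taken from any actual Y2K-era system) of how two-digit year storage, a deliberate space-saving design, makes date arithmetic go wrong at the century rollover:

```python
# Many pre-2000 systems stored years as two digits to save storage.
# Date arithmetic on those fields works fine -- until the rollover.
def years_elapsed(start_yy, end_yy):
    """Elapsed years computed from two-digit year fields."""
    return end_yy - start_yy

# A policy issued in 1985, checked in 1999: correct.
print(years_elapsed(85, 99))   # 14

# The same policy checked in 2000, stored as '00': negative age.
print(years_elapsed(85, 0))    # -85
```

Nothing here is broken in the conventional sense; the code does exactly what it was written to do, which is why the column calls Y2K a design assumption that outlived its planned lifetime rather than a bug.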
On the other hand, the bug responsible for last year's electrical blackout, which cut off electricity to 50 million people in eight U.S. states and Canada, really was a bug. Now referred to as the "XA/21 bug," the problem in General Electric's XA/21 energy-management system was finally uncovered in a code audit.
"This fault was so deeply embedded, it took weeks of pouring through millions of lines of code and data to find it," said a GE spokesperson. The bug caused audible and on-screen alarm systems to fail, thereby slowing human response. Backup servers then failed because they couldn't handle unprocessed events queued up when the main system went down. All in all, operators didn't know for more than an hour that they were viewing outdated information because the system crashed silently.
There are all sorts of reasons for bugs in software, including that software is becoming more and more complex. Programs are now measured in millions of lines of code and are distributed among multiple machines worldwide. As well, management stares at 80 percent of development costs going for testing and debugging, but doesn't understand where the return on investment is. Finally, there are the programmers themselves, who are often rushed, overworked, or sometimes not up to the job because they don't read Dr. Dobb's Journal.
Sure, debugging tools have come a long way, and most of us have a favorite debugger. But the tools are only as good as the techniques used when working with them. One of the best sources of debugging techniques is John Robbins's classic, but out-of-print, Debugging Applications (Microsoft Press, 2000), updated as Debugging Applications for Microsoft .NET and Microsoft Windows (Microsoft Press, 2003). Robbins, who writes the "Bugslayer" column for MSDN magazine and was an engineer for Compuware/NuMega's BoundsChecker and SoftICE debugging tools, does a masterful job of codifying essential debugging techniques and strategies.
While it has fewer pages but a longer title, David Agans's Debugging: The Nine Indispensable Rules for Finding Even the Most Elusive Software and Hardware Problems (Amacom, 2002) is, well, indispensable. Of course, as new languages and platforms emerge, Agans may need to add a new rule or two. When it comes to programming bugs, unlike Rhyniognatha, nothing is set in stone.
Jonathan Erickson
editor-in-chief
jerickson@ddj.com