Horse Stalls and Barndoors

Dr. Dobb's Journal November, 2005

By Michael Swaine

Michael is editor-at-large for DDJ. He can be contacted at mike@swaine.com.

In a recent issue of The New Yorker entirely sponsored by Target—and there's a phrase I'd have bet good money I'd never have occasion to type—the entire "Letters" section was given over to reminiscences about nuclear war strategist Herman Kahn of the RAND Corporation, arguably the father of modern Neocon political philosophy.

Part of the necessary loss of innocence that comes with taking on the responsibilities of full personhood is discovering the clay feet of father figures, and apparently most of Kahn's scary advice on nuclear deterrence was based on deeply flawed intelligence about Soviet nuclear capability. Doesn't that sound familiar?

Kahn was an imposing character. One letter writer remembers how awed he was, 50 or so years ago, seeing Kahn playing kriegspiel on the RAND patio with polygenius Johnny von Neumann—although he was less impressed in a later encounter.

"I questioned one of his startling assertions and asked for his source. He said, 'I'll get back to you on that.' At the next meeting, I asked him again. He said, 'Newsweek.'"

The RAND Corporation, no relation to early computer manufacturer Remington Rand, was the big post-WWII U.S. government think tank whose alumni include Donald Rumsfeld and Daniel Ellsberg, as well as the aforementioned von Neumann and Paul Baran, arguably one of the fathers of the Internet. Thinking about RAND and Kahn led me to thinking about World War II and the technological imperative and whether technologies really have fathers (or mothers), and about the atomic bomb and the Manhattan Project and technology's loss of innocence.

The same themes also came to mind as I was reading Mike Hally's Electronic Brains: Stories from the Dawn of the Computer Age (Joseph Henry Press, 2005; ISBN 0309096308). But I'll get back to that. First a bit about railroads and Apple and Microsoft and Google.

The Technological Imperative

The technological imperative is a phrase with at least two meanings: It can be normative, meaning that whatever we can do we should do. But it more often means something vaguely mystical: that technology occurs when it is ready to occur, on its own, making use of whatever genius—or bicycle mechanic—happens to be handy.

I find this a little hard to buy, this mystical idea that technological innovations and discoveries just sprout naturally as if they were the mushroom caps of some subterranean technological mycelium. But it's a recurring meme in the history of technology.

The aforementioned Johnny von Neumann believed in the seeming inevitability of technology, saying, "technological possibilities are irresistible to man." Science-fiction writer Robert A. Heinlein, paraphrasing the great collector of oddities, Charles Fort, put it this way: "You railroad...when it comes time to railroad." Software designer Ned Gulley points out that "Moore's Law...has become a mythological imperative with the apparent force of physical law."

And a certain phrase appears again and again in literature about technology:

"If the Wright Brothers hadn't figured out how to fly, someone else would have."

"If Bill Gates hadn't done it, someone else would have."

"If we hadn't done it in 1971, someone else would have invented the microprocessor in a year or two" (Intel's Ted Hoff).

"If we hadn't done it, somebody else would have" (the Manhattan Project's Robert Dickey).

"If I hadn't done it, someone else would have" (the Third Reich's Adolph Eichmann).

But isn't there something creepily mystical about such assertions?

Sure, some other inventor would have come up with a successful heavier-than-air aircraft if the Wright brothers hadn't. Other inventors had been trying hard to solve that well-defined problem for a generation and no new science was needed for the breakthrough. But as a general principle, the somebody-else-would-have-done-it claim seems like mysticism.

Still, an aggregate of unpredictable random acts of creativity can lead to a breakthrough that looks inevitable after the fact; statistics can grind individual unpredictability into something that looks like law. Maybe there is something to this technological imperative.

Apple, Microsoft, and Google

If the technological imperative is some sort of law, in the sense of Moore's Law, I can think of a few more technological laws, like the Barn Door Principle. It doesn't do any good to lock the barn door after the horses have run away.

Apple seems to think that it can keep people from running Mac OS X on non-Apple hardware now that it's moving to Intel. Microsoft seems to think that it can retain the kind of control it had over its customers as it moves many of the capabilities of its operating system to the Internet.

Both companies may be about to be smacked by the barn door.

Then there's Google.

Google has expanded beyond search into e-mail, mapping, blogging, digital photo management, books, instant messaging, and desktop tools. It may soon enter the business of providing services to mobile phone users, may go into direct competition with PayPal and/or Amazon. And it might release its own browser or even its own operating system—or some collection of services and tools that shifts the locus of control away from the operating-system provider.

"Google is aiming for that most coveted position in technology: a platform that...is so popular that outside software developers write programs...that render the [platform] indispensable..."

And recently, the company once so admired by all fired an employee for joking online that the company perks were a ploy to keep people at their desks longer. And it announced that it would not talk to any CNet reporter for a year because it objected to an article a CNet reporter wrote.

Is there a law that says that companies get bad when they get big? Is Google turning into Microsoft, in terms of both its market ambitions and its business ethics?

Was WWII the Father of the Digital Computer?

Time-to-railroad thinking runs on another track from technological paternity claims. In Electronic Brains, Mike Hally gives some time to both views.

Letting the second view borrow a little from the first, an argument could even be made that the father of the modern general-purpose stored-program digital electronic computer was World War II. By no less specious logic, it fathered me and the rest of the Baby Boomers, not to mention the atomic bomb and the RAND Corporation and the Internet and the career of Donald Rumsfeld.

Hally doesn't make that argument, but he does emphasize the influence of WWII on the early development of the computer. His contention that ENIAC couldn't have been developed without war-level government backing ("It is very unlikely the ENIAC would have been built without the army's support") is convincing. He's equally convincing that John Atanasoff's ABC computer probably would have been developed to completion if the war had not interfered.

He's less convincing, I think, when he dismisses the idea that the ABC computer would have fueled the kind of technological growth that ENIAC did.

I think his speculation about the computer industry absent WWII raises an interesting line of thought: Assume that WWII—and as a result ENIAC—hadn't happened. Stipulate that it was time to railroad—er, compute. Then what form would the first computers have taken? What form would the computer industry have taken? Would something like Atanasoff's simpler approach have been the model to follow? Without the largesse of the Department of Defense, would research have pushed harder in the direction of reducing cost and size? Would we have bypassed the wall-sized monsters and got to microcomputers more directly and quickly?

Hally doesn't address that speculation. What he does is tell some interesting stories from the dawn of the computer era, telling them well and, in some cases, more or less for the first time. His book grew out of a series of BBC radio shows based on interviews he did with computer pioneers on three or four continents. To produce the book, he expanded on that material and supplemented it with entirely new work on IBM, Australian computing, and Alan Turing. In my opinion, the BBC-based material makes up the most interesting parts of the book.

Post-War Stories

The book is best viewed as a collection of stories; it's not a complete and even-handed history of its period—the decade or so after WWII. And some of the stories are all too familiar. I don't know that Hally has anything fresh to say about Charles Babbage and Ada Lovelace, or on other bits of computer history that he draws from existing, available sources. But when he's drawing on his own interviews, he pulls out some intriguing details.

He does more than that in the case of the Rand 409 computer. No one else seems to have written about this important early computer. No books, he says, or even chapters of books. Hally has now told the story, based on interviews with several Rand engineers.

Briefly, in 1943, James Rand, president of the business machine company Remington Rand, was blown away by a pitch from an engineer from a rival company. The engineer, Loring Crossman, wanted to come to work for Rand and build a general-purpose electronic digital computer to process business data. At the time, ENIAC and Atanasoff's ABC computer were still under development, unknown to Crossman, and were special-purpose devices anyway. Even when built, ENIAC was nothing you would want to put in an office to do routine business processing. What Crossman was proposing was revolutionary on several fronts. Electronic. Digital. General purpose. A business computer.

Hally goes on to tell how Rand bought Crossman's pitch, and how Crossman actually delivered what he had pitched, and of the engineers who made it happen. Along the way, Rand pulled some big names onto the project masthead, including retired General Leslie Groves, who had run the Manhattan Project, and General Douglas MacArthur, Supreme Allied Commander of the Southwest Pacific theater during World War II, who seemed chiefly concerned with getting his picture taken and noting what time the engineers arrived at work in the morning. It wasn't clear whether this was before or after MacArthur took on the management of the Korean War, proposed dropping 50 atomic bombs on China, and was fired by President Truman.

The Rand 409 operation was housed in a former barn in Rowayton, Connecticut, and the engineers' cubicles were former horse stalls. The Rand 409 computer itself was designed, built, and put on sale in 1950. It was attractive at around $100,000, roughly a tenth the price of a Univac. The first Rand 409 was delivered to the IRS in 1951, and soon Remington Rand was delivering a 409 every week. Hally thinks it has a good claim to the title of first business computer.

The interview process served Hally well in the Rand 409 chapter.

The original material in his chapter on British computing and the individual whom some call the father of the computer, Alan Turing, was drawn from an interview with British computer pioneer Maurice Wilkes, and is definitely worth the read. But Hally finds himself wondering why Wilkes hasn't received the attention that Turing has, and he sets out to correct that error. Here it could be argued that the interview process may be biasing the interviewer, since Turing wasn't around any longer to present his case.

In the chapter on the Lyons computer, I am not sure what happened, but I doubt that Hally really wanted to spend all those pages describing a British catering company. Maybe it sat well with the original BBC audience, but it seems odd in a book about early computers. Granted, the company did develop its own computer, and even sold computers to the Soviet Union. There's interesting material in the chapter, once you get through the tutorial on large-event catering.

Of course, the Soviet Union had its own computers. Hally devotes a fascinating chapter to Soviet computing—more precisely, Ukrainian computing—and it's full of material that I hadn't come across before.

And then there's MONIAC.

This chapter doesn't really belong in the book, I suspect, since it's about an analog computer. But it would have been a shame to leave it out.

MONIAC was a bizarre contraption of clear tubes and boxes with colored liquid flowing through them, all to model the flow of money in a national economy. New Zealand inventor Bill Phillips figured out that economic models were just recycled hydrodynamics, and capitalized on that insight to build a machine that allowed economists and students and businessfolk to fiddle with one economic variable and see, dynamically, the effect on the entire economy. It was even better than a spreadsheet because it was more dynamic: you actually saw the ramifications of a change flow around the system. Specific economic theories could be tested or demonstrated with MONIAC. It even resolved a dispute between two leading economists. The Ford Motor Company and the Government of Guatemala bought MONIACs. Later, Phillips showed that you could model international economies as well as national economies by hooking up two MONIACs and feeding the exports of one to the other as imports.
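The circular-flow idea behind MONIAC can be sketched in a few lines of code. This is a deliberately crude stock-flow model with made-up parameters, not Phillips's actual calibration: national income is a "tank," and taxes, savings, and spending are "flows" that drain and refill it each cycle. Opening the government-spending valve raises the level throughout the system, which is exactly the kind of what-if fiddling MONIAC made visible.

```python
def simulate(steps=50, tax_rate=0.2, savings_rate=0.1, govt_spending=20.0):
    """Iterate a crude circular-flow economy; return the income history.

    Hypothetical parameters for illustration only. Each step: income
    drains into taxes and savings, the remainder flows back as
    consumption, and government spending refills the tank.
    """
    income = 100.0  # the main tank: national income
    history = []
    for _ in range(steps):
        taxes = tax_rate * income                   # flow out to government
        savings = savings_rate * (income - taxes)   # flow out to savings
        consumption = income - taxes - savings      # flow back as spending
        income = consumption + govt_spending        # inflows refill the tank
        history.append(income)
    return history

# Opening the spending valve lifts the equilibrium level of the whole
# system, as adding liquid did in MONIAC.
low = simulate(govt_spending=20.0)[-1]
high = simulate(govt_spending=30.0)[-1]
assert high > low
```

With these numbers the model settles toward a fixed point where inflows balance outflows; change any valve setting and the new equilibrium ripples through every flow, which is what made the hydraulic version such an effective teaching machine.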

Phillips wasn't an economist when he built MONIAC, but when he showed it to economists and they learned how accurate it was, they embraced him, degree or no. Before long, he became one of the leading economists of the 1950s.

But it is what he was doing before the 1950s that makes up some of the most colorful reading in the book. Phillips was what you'd call a character.

That Curious Character

Speaking of characters, the canonical characterization for Richard Feynman is "curious character," and I am unreservedly pleased that Ralph Leighton has recollected his previous collections of Feynmanisms into a new omnibus volume called Classic Feynman: All the Adventures of a Curious Character (W.W. Norton & Co., 2005). The book contains, along with the content of the major pair of Leighton's previous Feynman books, three significant new elements: a "Prologue" by Freeman Dyson, an "Afterword" by Alan Alda, and a CD of Feynman in his own voice. I had the privilege of reading an advance copy, with the downside that I'll have to wait along with you for that CD.

I know you know who Richard Feynman was, and those two other names are doubtless familiar, but it is just possible that you don't know what Alan Alda is doing in a book on Richard Feynman. Although that would mean that you don't know about Alda's intelligence, his interest in and commitment to science, his hosting of Scientific American Frontiers, and the fact that, after reading Ralph Leighton's books filled with Feynman's words and thoughts, Alda produced and starred in a play about Feynman called "QED," and that would be too bad.

The "Afterword" is Alda's 2002 Cal Tech commencement address. In it, he talks about the challenge of capturing in one play the single essence of a man who was "a revered teacher, a bongo player, an artist, a hilarious raconteur, [and] a safecracker." He toys with the idea of using a Feynman approach to physics in order to approach Feynman: the sum over histories. "Just as Feynman saw a proton taking every possible path on its way to your eye, Feynman himself took every possible path on his way through life." Alda wisely rejected the approach as bad theater, but it's an entertaining idea.

In his "Prologue," Dyson points out that Feynman's dazzling originality was not in the substance of his work, but rather in its style. Feynman did not develop new realms of physics, but he developed new ways of seeing physics—and other aspects of life as well. In fact, really seeing and communicating clearly what he saw was Feynman's thing—and Alda clearly got that across in his portrayal of Feynman.

And that originality of style was true originality. Harking back to a theme I introduced at the beginning of this column, it seems safe to say that if Murray Gell-Mann had not come up with the theory of quarks, somebody else would have: George Zweig or Feynman himself, most likely. But if Feynman hadn't come up with his space-time approach and his wonderful diagrams for picturing particle interaction, it's a good guess that nobody else would have. Feynman was a true original.

DDJ