The history of computing in general, and of personal computing in particular, is one of my little hobbies, and I'm fairly compulsive about buying books on this topic whenever they appear. The trade literature is surprisingly sparse in this area, and the scholarly literature not much better--not too surprising, I guess, because the intact, working relics of the pioneering era are even less common. It's pathetic how many of the artifacts of early digital computing have already been consigned to the junkpiles and paper shredders, in spite of people such as Gwen Bell at the Boston Computer Museum who have been making an earnest effort in recent years to recover and safeguard some of the most precious machines, program listings, and documentation. I'm as guilty of this heedless destruction of our heritage as anyone. When I owned my first Imsai 8080 computer, I had a copy of the source code for Gates and Allen's original 4K ROM Basic, and I threw it away when I sold the Imsai because I couldn't imagine ever needing it!
The two trade books of choice, if you are interested in this subject at all, are Steven Levy's Hackers and Paul Freiberger and Michael Swaine's Fire in the Valley. Both are vivid, fluently written, predominantly accurate accounts, but each has a somewhat different emphasis: Hackers begins with MIT's Tech Model Railroad Club, the artificial intelligence research lab of John McCarthy, SpaceWar, and semilegendary, compulsive, social-misfit programmers such as Bill Gosper and Richard Greenblatt. About halfway through the book, Hackers arrives at the Silicon Valley Homebrew Computer Club and personalities such as Lee Felsenstein, Jim Warren, Bob Albrecht, Dennis Allison, Ted Nelson, Bill Gates, Stephen Wozniak, and Steven Jobs--which is pretty much where Fire in the Valley picks up the story. Fire in the Valley is a relatively practical book with detailed coverage of early microcomputing retailing (such as it was), business relationships, operating systems, and "productivity software." Hackers spends much more time talking about the Apple and Atari game empires of companies such as Sierra On-line, and hearkens back fondly to a golden age of "The Hacker Ethic" which, I suspect, existed mainly in a few idealistic people's imaginations--much like the Camelot of John F. Kennedy.
The sheer talent of some of the characters in Hackers and Fire in the Valley can strike awe into your heart, and the sheer melodrama of Hackers may bring a smile to your lips, but both books are rather sad as well. So few of the original programmers, engineers, and entrepreneurs portrayed by Levy, Freiberger, and Swaine have been sufficiently well-rounded, broadly educated, and financially savvy to maintain control over their own inventions, their own companies, or even their personal destinies. The fate of Richard Stallman, one of McCarthy's hacker proteges, is a depressing example. Stallman, widely admired as a virtuoso coder for his EMACS editor, is apparently content to remain an angry iconoclast and spend the rest of his life tilting at windmills. At present, he is furiously rewriting a 20-year-old operating system so that he can give it away to spite AT&T--while the rest of the world moves on to new operating system architectures, new programming paradigms, and new user interfaces.
Robert Slater's Portraits in Silicon contains 31 thumbnail sketches of computer hardware and software pioneers. The book is heavily weighted toward engineers, mathematicians, and physicists, beginning with Charles Babbage, but it does include a healthy cross-section of latter-day microcomputer programmers as well. Portraits in Silicon is worth its purchase price for the chapter on Konrad Zuse alone. Zuse is a brilliant German engineer who evidently deserves the title of inventor of the general-purpose, programmable digital computer at least as much as Atanasoff, Mauchly, or Eckert. Zuse built his first working machine in 1936-1938, and his fourth machine--constructed in Berlin on a shoestring budget during 1943-1945--eventually found a home in the Technische Hochschule in Zurich, Switzerland, and remained in use well into the 1950s. Unfortunately for Zuse, but fortunately for the Allies, the German government was oblivious to the importance of Zuse's work, and residuals of his designs do not survive in any of the mainstream computer architectures of today.
Slater's book, while valuable, must be read with caution for several reasons. First, Slater, who is a Time magazine correspondent, seems to have virtually no understanding of the technical implications of his material. Most of the time, due to the excellence of his technical reviewers, he gets away with it, but occasional paragraphs on hardware specifics are conspicuously confused. Second, when the stories provided by his interviewees are in obvious conflict, Slater doesn't want to hurt anyone's feelings or step on anyone's toes, so the reader ends up with no clear idea of where the truth actually lies. Third, the historical accuracy of the entire book is thrown into question by blatant deficiencies in the microcomputer-related chapters. For example, in the biography of Bill Gates, Slater writes:
IBM asked Gates to design the operating system for the new machine--what would become the incredibly popular PC...Gates went to work on what would become MS-DOS--Microsoft Disk Operating System. He filed his report personally in Boca Raton in September, and in November he got the contract... Gates chose a small room in the middle of Microsoft's offices in the old National Bank building in Bellevue, and got to work.
Incredibly, Slater fails to describe the role of Tim Paterson, Seattle Computer Products, or 86-DOS in the genesis of MS-DOS 1.0--even though this role is common knowledge throughout the industry. For that matter, he doesn't even mention any of the Microsoft employees who were involved with the first few releases of MS-DOS: Bob O'Rear, Chris Peters, Chris Larson, Aaron Reynolds, and Mark Zbikowski, among others.
Susan Lammers's book, Programmers at Work, must surely be one of the most readable, sympathetic books about software developers ever published. Lammers was not interested in muckraking or confrontation, but rather made it her goal to explore the personalities, attitudes, and work habits behind some of the great personal computing products of the mid-eighties. There is no pretense at objectivity or fact-finding in this book--the interviewees were basically allowed to take the discussions in any direction they wished and edit the transcripts to present themselves in the most favorable light--but the book is quite revealing nonetheless. Several of the chapters literally reek of hubris (I'll let you discover these for yourself), while others--such as the interviews with Gates, Carr, and Hertzfeld--are unexpectedly disarming.
One of the more intriguing features of Lammers's book is the inclusion of design jottings or source code by the various programmers she talked to. We see pages from the manuscript of the original Visicalc user manual by Dan Bricklin, notes to himself by Robert Carr on the data structures underlying Framework, excerpts from the source code for 8080 Basic and a couple of amateurish articles from the MITS user newsletter by Bill Gates, a full-fledged animation demonstration program for the Macintosh by Andy Hertzfeld, an elegant page concerning character grey-scaling from one of John Warnock's notebooks, and a PDP-10 assembly-language program for compiling wirelists by Charles Simonyi--using Hungarian notation, of course! A lot of water has passed over the dam since Susan Lammers put this book together in 1986, but it's still a good investment of your time and money.
Accidental Empires, by Robert X. Cringely, is the very antithesis of Programmers at Work. Accidental Empires is classic yellow journalism--unsubstantiated anecdotes, mangled facts, inflammatory speculation, and half-baked freshman psychology all masquerading as history. This book could just as well have been named Fear and Loathing in Cupertino! The chief targets of Cringely's vitriol are Bill Gates and Steve Jobs, although a good many other industry luminaries, ranging from Donald Knuth to John Warnock, receive glancing swipes of the claws as well. This is not to say that the book isn't entertaining--entertainment and making money are, after all, the primary goals of yellow journalism--or even that it doesn't contain some valuable insights. But Accidental Empires is ultimately doomed to irrelevance because its author, or authors, didn't have the integrity to offer their interpretations of our industry under their true names. A pity.
Five Views of the Same Event

From Fire in the Valley

Then the Popular Electronics article came out. Gates' friend Paul Allen ran through Harvard Square with the article to wave it in front of Gates' face and say, "Look, it's going to happen! I told you this was going to happen! And we're going to miss it!" Gates had to admit that his friend was right; it sure looked as though the "something" they had been looking for had found them. He immediately phoned MITS, claiming that he and his partner had a BASIC language usable on the Altair. When Ed Roberts, who had heard a lot of such promises, asked Gates when he could come to Albuquerque to demonstrate it, Gates looked at his childhood friend, took a deep breath, and said, "Oh, in two or three weeks." Gates put down the receiver, turned to Allen and said: "I guess we should go buy a manual." They went straight to an electronics shop and purchased Adam Osborne's manual on the 8080.

From Hackers

Not long before MITS began shipping Altairs to computer-starved Popular Electronics readers, Ed Roberts had gotten a phone call from two college students named Paul Allen and Bill Gates. The two teenagers hailed from Seattle. Since high school the two of them had been hacking computers... The Altair article, while not impressing them technically, was exciting to them: it was clear microcomputers were the next big thing, and they could get involved in all the action by writing BASIC for this thing. They had a manual explaining the instruction set for the 8080 chip, and they had the Popular Electronics article with the Altair schematics, so they got to work writing something that would fit in 4K of memory. Actually, they had to write the interpreter in less than that amount of code, since the memory would not only be holding their program to interpret BASIC into machine language, but would need space for that program that the user would be writing. It was not easy, but Gates in particular was a master at bumming code, and with a lot of squeezing and some innovative use of the 8080 instruction set, they thought they'd done it. When they called Roberts, they did not mention they were placing the call from Bill Gates' college dorm room. Roberts was cordial, but warned them that others were thinking of an Altair BASIC; they were welcome to try, though. "We'll buy from the first guy who shows up with one," Roberts told them.

From Accidental Empires

Like the Buddha, Gates' enlightenment came in a flash. Walking across Harvard Yard while Paul Allen waved in his face the January 1975 issue of Popular Electronics announcing the Altair 8800 microcomputer from MITS, they both saw instantly that there would really be a personal computer industry and that the industry would need programming languages. Although there were no microcomputer software companies yet, 19-year-old Bill's first concern was that they were already too late. "We realized that the revolution might happen without us," Gates said. "After we saw that article, there was no question of where our life would focus."

"Our life?" What the heck does Gates mean here--that he and Paul Allen were joined at the frontal lobe, sharing a single life, a single set of experiences? In those days, the answer was "yes." Drawn together by the idea of starting a pioneering software company and each convinced that he couldn't succeed alone, they committed to sharing a single life--a life unlike that of most other PC pioneers because it was devoted as much to doing business as to doing technology. Gates was a businessman from the start; otherwise, why would he have been worried about being passed by? There was plenty of room for high-level computer languages to be developed for the fledgling platforms, but there was only room for one first high-level language. Anyone could participate in a movement, but only those with the right timing could control it.

From Portraits in Silicon

After Gates entered Harvard in the fall of 1973, Allen challenged him to develop a BASIC interpreter for the Intel 8008, but Gates soon decided that the 8008 instruction set was not powerful enough for BASIC. Allen next urged that they start a microcomputer firm. The two had already spent $360 to purchase one of the very first microcomputer chips. The turning point in their young careers came when they read the January 1975 issue of Popular Electronics. The Altair microcomputer, based on the 8080 chip, made by an Albuquerque, New Mexico firm called MITS, and selling for $350, appeared on the cover. Allen was the first to see the article. He noticed a copy of the magazine at the newsstand and hastily tracked down Gates. Here was the first truly cheap computer! Allen ran through Harvard Square waving the article in front of Gates, issuing a friendly warning that the train was leaving, and if the two of them didn't get to work, they would not be aboard. Gates' problem was whether to stick to his present studies in pursuit of the legal career his parents wished for him, or give full attention to computers. The latter won out; the two young men wanted to make sure they wouldn't miss what was happening. "We realized," Gates recalls, "that the revolution might happen without us. After we saw that article, there was no question of where our life would focus." Allen proposed to Gates that the two try to write a BASIC--the simple, high-level computer programming language--for the Altair. At least one minicomputer firm had insisted that it was impossible to write a high-level language that would run on a personal computer. But the two young men wanted to give it a try. They informed MITS of their plan.

From Programmers at Work (Gates speaking to Lammers)

The really great programs I've written have all been ones that I have thought about for a huge amount of time before I ever wrote them. I wrote a BASIC interpreter for a minicomputer in high school. I made massive mistakes in that program, and then I got to look at some other BASIC interpreters. So by the time I sat down to do Microsoft BASIC in 1975, it wasn't a question of whether I could write the program, but rather a question of whether I could squeeze it into 4K and make it super fast... Paul Allen had brought me the magazine with the Altair, and we thought, "Geez, we'd better get going, because we know these machines are going to be popular." And that's when I stopped going to classes and we just worked around the clock. The initial program was written in about three and a half weeks. We ended up spending about eight weeks before I had it fully polished the way that I really liked it. And then I later went back and rewrote it. No great programmer is sitting there saying, "I'm going to make a bunch of money," or "I'm going to sell a hundred thousand copies." Because that kind of thought gives you no guidance about the problems. A great programmer is thinking, Should I rewrite this whole subroutine so that four people, instead of three, could call it? Should I make this program ten percent faster? Should I really think through what the common case in here is so I know how to order this check?

Copyright © 1992, Dr. Dobb's Journal