FIRE IN THE VALLEY REVISITED

Where there's smoke, there's bound to be fire

Michael Swaine

Michael is DDJ's editor-at-large and coauthor of Fire in the Valley, a landmark history of the personal computer industry, published by Osborne McGraw-Hill. He can be reached at 501 Galveston Drive, Redwood City, CA 94063.


History is counted among the humanities, rather than, say, among the sciences. I never thought much about this categorization until I tried to write some history. Paul Freiberger and I were editors at InfoWorld in 1981 when our boss, Thom Hogan, suggested that we interview the heroes of the personal computer revolution while all the frantic events were fresh in their memories, and write a book that would convey the facts and the feelings of the revolution.

By the time we had finished Fire in the Valley I knew which of the meanings of the word human motivated the classification of history among the humanities. The humanities are the human studies, and history is a human study in the sense of to err is human. We did a lot of choosing among alternative versions of events in researching Fire in the Valley, often deciding what must have been true on the basis of plausibility. The judgment of plausibility is, of course, highly subjective.

We were not alone in making such subjective choices. We wrote a history of a revolution, but history itself appears to be of many minds about the nature, duration, and very existence of that revolution or, as it may be, those revolutions. I have since read history books and articles that contend that the revolution hasn't happened yet, that it was over in 1974, that there hasn't been any revolution, and that there have been several. It appears that we can take our choice of histories. I choose to say that there was a revolution, and that it's over, and that we won.

Now that the recent glorious war of liberation is over, it is time to celebrate our heroes, decorate the battle sites, and begin the job of civilizing the wilderness and creating the new society for which we fought.

Revolutionary Decade: 1974-1983

Just as the history of the American revolution does not begin with the Magna Carta, I won't hark back to Larry Tesler and Jeff Rulifson's readings in semiotics at Xerox PARC, or Doug Engelbart's blueprints for user-friendly design at SRI, or that bizarre community of neologistic technophiles at MIT.

One event preceding the main event must be mentioned, though: The 1974 self-publication by Ted Nelson of Computer Lib/Dream Machines. Computer Lib was the Common Sense of the revolution, and Nelson its Tom Paine. It's a book still worth dipping into, which is, in fact, the only way in which it is possible to read the book.

But the revolution actually broke out over Christmas break, 1974-75, with the publication of an article on a kit computer from a firm in Albuquerque. Specifically, it was the hobbyist phase of the personal computer revolution that began when that January 1975 issue of Popular Electronics hit the stands, with its cover story on the MITS Altair computer. There were other fronts on which the revolution could have broken out: Any electronics hobbyist who could lay hands on a microprocessor could attempt to build a computer, and some did. Stephen Wozniak, whose first attempt in 1971 went up in smoke, was only the best known of them, but in the Southwest, the Northwest, in New Jersey and New York and Massachusetts and California, hobbyists were creating the personal computer. Don Lancaster demonstrated how to build your own terminal in 1973, and Radio-Electronics published a piece on the Mark-8, "your personal minicomputer," in 1974.

But even though the Altair featured on the Popular Electronics cover was one of many, and was in fact only a mock-up, it was the shot heard 'round the hobbyist community, because it promised a computer on your desktop for under $500. The average hobbyist couldn't buy the parts for that price. The Altair started the revolution.

In 1975, computer clubs sprang up around the country; groups in southern California, Silicon Valley, and New Jersey would be especially influential, but the Boston Computer Society, founded two years later by Jonathan Rotenberg, would prove more enduring. Dozens of small companies sprang up overnight, growing to hundreds within two years. Computer retailing began in 1975 at Dick Heiser's Computer Store in Los Angeles, and by 1977 the first computer store franchise chain had been launched: It was called Computerland. Computer magazines evolved out of user group newsletters, with Wayne Green's Byte rapidly achieving preeminence among the hundreds that existed at their peak. Personal computer shows became popular. Software companies were launched, starting with Microsoft, which wrote the Basic for the Altair, and Digital Research, which created the CP/M operating system. Languages and operating systems were the first kinds of software needed for the machines, and Dr. Dobb's Journal was launched in 1976 to put a Basic interpreter in the hands of hobbyists.

The hobbyist phase in its purest form came to an end in 1977, the year Apple opened offices in Cupertino and put an assembled and slick-looking Apple II on the market. This was also the year that the big electronics companies (Tandy/Radio Shack and Commodore) announced their first PCs, and the year MITS effectively went under.

By 1978 it was possible, with a lot of imagination, to consider using a personal computer for business purposes. Disk drives were replacing cassette recorders for storage. There were word processors and databases and accounting programs. By the end of 1979, there was a very good word processor called WordStar, and an innovative electronic spreadsheet program called VisiCalc. You could choose among IMSAI and Cromemco and Processor Technology machines, or bypass the hobbyist-standard S-100 bus and buy an Apple II, Radio Shack Model I, Commodore PET, Exidy Sorcerer, or a computer from video game manufacturer Atari. By 1980, Radio Shack, Texas Instruments, and Sinclair had all introduced very low-cost computers and made it reasonable for non-fanatics to satisfy their curiosity about personal computers. Most buyers were hobbyists and most software fell into the category of utilities or games through 1978 and 1979, but most of the companies understood that the personal computer would be an extremely important business tool, and were fighting to establish a serious reputation and a position in the market before some mainframe computer company got into the game and changed the rules.

Apple had emerged as the leading personal computer company by 1980. The machine was well designed, the company was skillfully managed, and Apple took upon itself the task of marketing the concept of a personal computer to a skeptical general public. It didn't hurt that VisiCalc, the first program that could sell a computer, ran only on an Apple II. Apple management was acutely aware of the need to get big enough to survive when the big company stepped in, and it was becoming clear that the biggest company would soon take the plunge.

Microsoft began developing an operating system for IBM in 1980. Hewlett-Packard had entered the fray in 1980, Xerox came in 1981, and DEC announced a line of personal computers in 1982. But it was the IBM PC, announced in mid-1981, that launched the next phase of the personal computer revolution.

From 1981 to 1984, most personal computer companies were concerned with IBM PC compatibility. Some companies didn't get it right, and failed. Some, such as TI, decided they could do without the competition of IBM, and got out of personal computers. But many hardware companies and many more software companies prospered in the new air of legitimacy that IBM brought to the market. Lotus, Ashton-Tate, and Microsoft became big companies. It was the Big Blue phase, but not because IBM dominated the market in sales. IBM simply defined what a PC was during this period.

That's why it is so significant that the IBM PC was an open machine. IBM could have released a proprietary machine at a premium price, differentiated it clearly from the computers that had sprung up out of the hobbyist community, and sold lots of computers into its loyal business user base. Instead, it produced a machine with an open architecture running a third-party operating system and third-party application software, all derived from that hobbyist community. It permitted competitors to develop IBM-compatible machines to the point where the clone makers had a greater collective market share than IBM. This was not what anyone had expected from IBM, and some credit may be due to Bill Gates for nudging IBM in the direction of an open architecture. But it was IBM that decided to talk to Microsoft in 1980.

Meanwhile, Apple was presenting itself as the guardian of the "computer power to the people" spirit of the hobbyist phase while building a new machine that would feature an architecture so closed that it was even hard to find a tool with which to open the case. The irony was not lost on the savvy personal computer community.

After the Fire

That's how things stood at the outset of 1984. That January, it looked as though the big news of the year in the computer industry was going to be a commercial: The legendary Ridley Scott Super Bowl ad for the Macintosh. The Mac itself was big news, and the Canon laser engine was an important technical advance that year. A few other things happened, too.

MS-DOS 2.0 was released and Bill Gates made the cover of Time. AT&T introduced its first personal computer. The press was chanting "shakeout," and Business Week crowned IBM the winner and sole survivor with 26 percent market share. Jack Tramiel resigned as president of Commodore, then bought Atari, vowing to turn the company from a democracy into a dictatorship. In 1984, the line workers in Silicon Valley computer companies made from $3.50 to $11.50 an hour, as compared with the $1.20 hourly average earned by Taiwanese workers. The companies behind VisiCalc (Software Arts and VisiCorp) got into a legal wrangle that had aspects of a mutual suicide pact.

This was the year when all the windowing systems for PCs were vying for attention. IBM introduced TopView, which briefly looked like a competitor to Microsoft Windows, Quarterdeck's DESQ, and a rumored DRI product. Jim Fawcette, editorial director of InfoWorld at the time, characterized TopView as "a program for expert users of novice programs." TopView was pretty bad, but it was from IBM, and a lot of IBM watchers thought that it must be part of an IBM plan to proprietarize the IBM operating system. DRI introduced GEM, which was praised in the press for bringing a Mac-like interface to the PC. The irony of that praise would soon be apparent.

Americans polled by the Louis Harris organization in 1984 said that they viewed computers as capable of making life easier and better, but they also feared that computers might take their jobs and undermine their privacy. At the same time, some users of online conferencing systems claimed that their privacy was being violated when magazines reprinted their electronic comments. The Harris poll supported the idea that a schism of computer haves and have-nots was growing. The Supreme Court upheld "the right to record video programs on our Betamaxes," as an InfoWorld story put it. The decision held up even though "Betamax" has the ring of "Tucker" today.

Spurred on by the movie WarGames and the arrest of the 414s in 1983, the FBI put into place a three-week training program for agents: They went in ignorant and came out fully qualified to investigate computer crimes. That year the FBI raided the homes of four Huntsville, Alabama, teenagers suspected of cracking NASA's Space Physics Analysis Network, which had little in the way of security and no classified or sensitive information.

It wasn't the FBI, but a Los Angeles detective and two Pacific Telephone security officers who ripped out Tom Tcimpidis's BBS hardware. Someone had posted a credit-card code and two Sprint access numbers on Tcimpidis's bulletin board, and when the Pac Tel police arrived at the door, the raid made Tcimpidis the computer community's cause célèbre of the year. The judge in the case said that he understood the issues involved, having seen WarGames. Pac Tel eventually dropped the case, but made it clear that it might prosecute other bulletin board operators.

By the fall, the House of Representatives was examining a computer crime bill that would make unauthorized access to a computer system a felony if it resulted in a $5000 loss to a victim or an equal gain to the cracker.

Nibble magazine either clarified or muddied the waters when it won an injunction against a program-typing service that wasn't doing enough typing. Nibble was publishing code in its pages for its readers to key in and use. The typing service, Amtype, ostensibly charged $10 for typing the code for people, sending them a disk with the code on it. In fact, Amtype typed the code once and ran off copies. Had it actually typed the code each time, Nibble argued, that would have been legal, but what it was doing was a violation of copyright.

Massachusetts Congressman Edward Markey launched an online discussion of nuclear weapons. Many candidates were using personal computers in their campaigns, a few even going online to get input from computer-using constituents.

1985 was a bloody year. Software Arts died of self-inflicted wounds, Lotus Development Corp. acquiring its product line, founders, and reputation for having invented the electronic spreadsheet. DRI, facing hard times, cut back on staff. Counting on GEM to pull it out of its slump, it lost some momentum when Apple forced it to redesign the product to look less Mac-like. Control of Apple Computer was wrested from Steve Jobs, who left, selling off stock, and taking key people with him to start a new company. Most industry leaders viewed this as a positive step for Apple, although Philippe Kahn said "the only people left are the bean counters."

The U.S. Patent and Trademark Office, which formerly had a policy of not granting patents for software, began taking applications for software patents seriously, as directed by the Supreme Court in decisions earlier in the decade. Several applications were under scrutiny, and Number 4,555,775 was granted to AT&T for a hardware/software display system it called "windows." AT&T's windows appeared to be a fairly specific invention, not related to windowing systems.

DDJ did some beard-pulling in 1985, finding weaknesses in Turbo Pascal, showing how to add both memory and a hard disk to the closed Macintosh, and publishing Richard Stallman's GNU Manifesto.

And so it went. In 1986 Intel laid off 700 employees, IBM unveiled its RISC-based RT PC, and credit card-sized computers came on the scene. By 1986, over 90 percent of the school systems in America had (some) personal computers. A study showed that personal computer use resulted in an 80 percent reduction in television viewing. DDJ went online in 1986.

1987 saw the Computer Security Act passed, mandating that government agencies protect the privacy and security of sensitive data. IBM lost five points of market share, roughly matching what Compaq picked up in unit sales. In 1987, language and operating system implementations were getting more sophisticated. Basic showed some new versions, C some standards efforts, and optimizing compilers and object-oriented programming were getting attention, as was Microsoft's operating system for the future, OS/2.

The Current Phase

In 1988 Apple announced a strategic alliance with DEC and sued Microsoft and Hewlett-Packard over look-and-feel. Microsoft and HP countersued; OS/2 1.1 met its ship date; Steve Jobs's company, NeXT, showed off its machine.

IBM was getting pressure from customers to license its Micro Channel architecture. Clone makers, though, criticized MCA, professed little interest in developing MCA machines, and, led by Compaq, formed a Gang of Nine to develop an alternative in the form of an extended AT architecture called EISA. IBM released two new computers labeled PS/2s but based on the existing AT bus architecture, while several Gang of Nine members admitted that they were hedging their bets and still working on MCA machines. Most observers felt that there was only room for one standard and that the battle would be won or lost on the 386 battleground.

Paul Heckel was awarded a patent for ZoomRacks, his card-in-rack hypertext system, while Xerox got a patent on the icons in its Viewpoint interface, the first patent granted on user interface elements.

The Open Software Foundation was formed to develop a non-AT&T standard for Unix. AT&T and Sun were at work on Open Look, a graphical front end for Unix; Sun licensed Viewpoint icons for Open Look; and IBM began talking with NeXT about developing a competing graphical front end for IBM's Unix, AIX.

In 1989 Quarterdeck won a patent for the multitasking windowing system in DESQview. The first EISA machines arrived, and were high-end machines that didn't exactly throw down a gauntlet to IBM.

In 1990 Windows 3.0 was delivered, pa rumpa pum pum. HP's New Wave desktop environment skipped version 2 and went straight to the magic number 3.0, adding agents (task-spanning macros). Borland announced that the next versions of Turbo Pascal and Turbo C would have object-oriented features, which would result in a huge increase in the number of programmers exposed to OOP.

A number of notebook computers with hand-printing recognition and stylus input were announced or hinted at. Xerox came out with a combination plain-paper fax, copier, printer, and scanner. Apple demonstrated that it wasn't kidding about getting competitive in its pricing by introducing new low-priced computers and lowering prices on some existing machines. PC manufacturers continued to keep the EISA, MCA, and old AT bus architectures alive, while the media conglomerate named MCA told IBM to stop using that acronym, and IBM agreed.

Lotus and Novell talked merger, but changed their minds. Raima entered into joint ventures with several Soviet software companies. Three years on, many government agencies were still not protecting sensitive data in their computers as the Computer Security Act of 1987 required. IBM picked up market share and spun off its Information Products division (typewriters, printers, keyboards), leading to speculation that other divisions would follow.

And in 1990 Mitch Kapor issued his Software Design Manifesto, his first step in a plan to create an environment in which good software design pays off commercially. He also founded, with Steve Wozniak and John Perry Barlow, the Electronic Frontier Foundation to educate the public about social issues of the information age.

If the revolution is over and we won, then what Mitch Kapor is up to now is more important than any work underway in any research and development lab in the world. The conscious design of software for use by people on the one hand, and the development of a consensus on values for electronic communication on the other, will define the kind of world we will inhabit in the future.

As software designers, you have the greatest opportunity and the greatest responsibility for the shaping of that world, because you are the architects of the future.


Copyright © 1991, Dr. Dobb's Journal