PROGRAMMING PARADIGMS

Fear and Loathing in the Valley of the Silicon Dolls

Michael Swaine

The homeless woman turned out of the alley in front of me as I walked along the sidewalk by Mr. Goody's in downtown Santa Cruz. If you ask me how I knew she was homeless, I have to admit that I didn't. There was an air about her, though, of someone down on her luck. The stooped posture, the unwillingness to meet the eyes of passers-by. The big tip-off, of course, was the shopping cart she was pushing, with its tattered garbage bag tied carefully shut, protecting I could only guess what meager belongings.

There but for fortune and Moore's Law, I thought, go I; and I wondered, not for the first time, at this strange passivity that has seized the American populace. What does it say about us that so many are willing to accept being homeless, or for that matter, that so many of the rest of us are willing to accept the homelessness of others? Home ownership is becoming a luxury that only the rich can afford, the gap between the rich and the poor is growing, and average workers are less well-off than their parents. Yet Americans act as helpless as Russian peasants mystified at the concept of democracy, clamoring for "leaders" to fix things, utterly unaware that it really is we who are in charge.

Although there was a chill in the air, the woman was probably warm enough in a bright white sweatshirt, across the back of which was emblazoned the reassuring news that "Borland fights for programmers' rights."

There are two or three good punchlines that this story could support, but I haven't the heart to deliver them. The best I can do is wistfully observe that this is just another example of a proud emblem of Borland International getting recycled and ending up where you'd least expect it. Sort of like Gene Wang.

Silicon Roadkill

The Gene Wang saga slipped your mind? Never fear, Michael Hyman will remind you. In PC Roadkill: Twisted Tales from Silicon Valley (IDG Books, 1995), Hyman dishes up the dirt on that and other scandals, lawsuits, and dirty tricks that have made life in the personal computer industry just a little more interesting.

He describes the big look-and-feel lawsuits, and he doesn't miss that apt Guy Kawasaki observation from 1992. Xerox had sued Apple over the look and feel of the Macintosh user interface, only to lose because the statute of limitations had run out. Kawasaki's comment: "Xerox can't even sue you on time." Hyman also sorts out who sued whom in the spreadsheet world, a particularly litigious product category. At one time or another, all of the following have been somehow involved in spreadsheet lawsuits: Lotus, Personal Software, Borland, Paperback Software, Santa Cruz Operation, WordTech, Mosaic, Novell, and Borland's insurance carriers.

Of course Hyman reports on the big ones: the government's antitrust investigations of Microsoft and of IBM. The Microsoft case may not be over, and Hyman reports that the 1956 consent decree against IBM may not be the final word in that wrangle, either.

There's a lot more in the book than stories of litigation. Hyman has put together a grab bag of untold stories and stories that some folks wish would remain untold. Code names and Easter eggs. Classic office pranks and t-shirt messages. Product marketing blunders and hiring and firing stories and industry history. It's a fun book, and he's at work now on the sequel.

Oh, the Gene Wang story. That's not the one in which Wang Labs, after filing for bankruptcy protection, sues Microsoft over technology in OLE that it claims infringes Wang patents, gets 90 million dollars out of Microsoft, and gets back on its feet, sort of. No, it's the one in which a Borland executive exchanges e-mail with the president of Symantec, quits to work for Symantec, has his e-mail read by Borland, is accused of trade-secret theft, and is threatened with jail time. This one also is still unresolved.

Gene Wang is scarcely the only ex-Borlander to end up in the camp of one enemy or another. Michael Hyman himself once was the business unit manager for the languages group at Borland and now works for "a large software company in the Northwest." That grrmphing sound you hear is Frank Borland turning over in his grave.

Paradigms Past

Hyman doesn't deal with every legal case in computing history in his coverage of industry litigation. In fact, he skips what was, in a purely historical sense, perhaps the most significant and certainly one of the most protracted and contentious cases: the legal battle over credit for the invention of the electronic digital computer. (He does, however, give full credit to the winner of that battle, something that anecdotal computer historians sometimes fail to do even today.)

This year is a good time to reflect on that battle and its outcome. For one thing, several of the key figures died last year, so the events and issues truly are history now. For the record, the computing pioneers who died last year, all of whom fully deserve to be remembered as inventors of the technology of electronic digital computers, are: J. Presper Eckert (June 3, 1995), John V. Atanasoff (June 15, 1995), and Konrad Zuse (December 18, 1995).

There's another reason why 1996 is particularly appropriate for a review of this bit of history. On February 14, the University of Pennsylvania began an 18-month celebration of the 50th anniversary of ENIAC. It will be interesting to see how delicately the participants tread as they walk around the topic of the invention of the automatic digital computer, a breakthrough once claimed by J. Presper Eckert and John Mauchly, the inventors of ENIAC.

In the 1940s, the electronic digital computer was struggling to come into existence through various avenues: Eckert and Mauchly, John Atanasoff, Howard Aiken at Harvard, several researchers in England, and Konrad Zuse in Germany. All were pursuing work that could have led, and in each case did lead, to the design and building of at least a prototype electronic digital computer. But it was Eckert and Mauchly who first proved that there was peacetime use for the things, and money to be made from building them. And it was Eckert and Mauchly who for years got most of the credit, until the issue was brought to court. Ironically, it was a claim by Sperry Rand, the holder of the patent for Eckert and Mauchly's work, that caused Eckert and Mauchly to be stripped of the title of inventors of the automatic digital computer.

From Judge Earl R. Larson's decision, on October 19, 1973:

The subject matter of one or more claims of the ENIAC was derived from Atanasoff, and the invention claimed in the ENIAC was derived from Atanasoff. SR and ISD are bound by their representation in support of the counterclaim herein that the invention claimed in the ENIAC patent is broadly "the invention of the automatic digital computer." Eckert and Mauchly did not themselves first invent the electronic digital computer, but instead derived that subject matter from one Dr. John Vincent Atanasoff.

Note the "bound by their representation" bit. The case was really about something else. Honeywell had sued Sperry Rand Corp. (SR above) and Illinois Scientific Developments (ISD), claiming that they were attempting to enforce a bogus patent. SR, in its counterargument, claimed that the invention covered in the ENIAC patent was nothing less than the invention of the automatic digital computer. I don't know whether that claim was essential to the defense of the patent, but it certainly raised the stakes.

Then things took an unexpected turn. It emerged that:

    1. Although he had done related work, Mauchly had developed no plans for an automatic digital computer before meeting Atanasoff in Ames, Iowa;

    2. Atanasoff had by that time developed plans that any competent engineer could use to build an automatic digital computer; and

    3. Atanasoff had shown those plans to Mauchly over the course of several days, allowed Mauchly to stay at his house, and let Mauchly take the plans home with him.

The judge had no trouble concluding that the developments covered in the ENIAC patent were derived from Atanasoff's work. And Sperry Rand had boldly claimed that those developments were the whole shebang: the invention of the automatic digital computer.

The Real Inventor

As a result, official computer history now gives that title to John Atanasoff. Except, that's not the whole story....

As hairy and protracted as the legal battle was, and as clear and decisive as the decision apparently was, the whole thing was really only a parochial squabble. History (or some historians' histories) really only credits Atanasoff and graduate student Clifford Berry with creating the first functioning prototype electronic digital computer in the United States. Worldwide, the credit goes to Konrad Zuse. Never heard of him? You're not alone. Zuse was pretty much unknown outside Germany while he was developing the computer and remained largely unknown in the United States until recent years. He did, however, unlike Atanasoff, get patents, make money, start a business employing 1000 people, and retire unambiguously honored in his own country.

And Atanasoff? Although he was late to get the credit he felt he deserved and never profited significantly from his invention, his story is not really a tragic one. He did not die either homeless or forgotten. His story, including the long drive in the night he took to clear his head and the roadhouse where he stopped and designed the electronic digital computer on paper napkins (or was it the back of the menu?), is a romantic adventure. Howard Rheingold, in Tools for Thought (Simon & Schuster, 1985; sadly, out of print), called him "the last of the lone inventors in the field of computation."

Meanwhile, back in present-day Iowa, Gary Sleege and John Erickson are building a full-scale replica of the Atanasoff-Berry Computer. It's scheduled to go online in August of this year. I'll keep you posted.

Now, does anyone remember the first successfully marketed toy digital computing device? The answer is at the end of the column.

AI Fights Back with Humor

Two years ago I reported on the efforts of the artificial intelligence community to get some respect from the public in general and from research funding sources in particular. The phrase "artificial intelligence" had been overused by hype artists, and using it in a research proposal or bragging that your company used AI technology was regarded, at best, as empty rhetoric, and at worst, as bad science. But after a long AI winter of disrespect, it looked like some buds of acceptance were peeking through the snow.

That perspective lends some flavor to an exchange in the Winter 1995 issue of AI Magazine (Volume 16, Number 4), the journal of the American Association for Artificial Intelligence.

It seems that Patrick J. Hayes and Kenneth M. Ford presented some, er, unofficial awards at the last International Joint Conference on Artificial Intelligence (IJCAI): the Simon Newcomb Awards, named for a distinguished astronomer whom they characterize as having been "hilariously wrong about artificial flight." Newcomb produced important tables for the motions of the moon and the planets and worked with Michelson in determining the velocity of light. He also wrote a number of popular articles asserting, in the strongest terms, the impossibility of heavier-than-air flight, some of them written after the Wright brothers flew at Kitty Hawk. The Simon Newcomb Awards are given out to distinguished philosophers or scientists who are hilariously wrong about artificial intelligence. According to Hayes and Ford, the kind of wrong arguments they are looking for are "those that a graduate student in computer science might find hilarious."

Several readers of Artificial Intelligence thought that the awards were unwise, that ridicule is not a proper critical method, blah blah blah. No sense of humor, obviously. One reader sniffed that he didn't recall being taught the method of ridicule in his graduate experimental methods courses. No doubt he's right, but the reason is that ridicule comes naturally to graduate students. If graduate students in computer science are not ridiculing their professors and anyone else who dares to offer an opinion about their field, I fear for the future of computer science.

Frankly, I'm glad to see the AI community responding in kind to its most vociferous detractors, some of whom are more than willing to treat the goals of AI researchers with ridicule. And after all, the granting of the awards was not positioned as an AAAI-sanctioned event, but merely as a whimsical presentation of some AAAI members at a buffet. Oh, the recipient of the Simon Newcomb Award at that buffet was Roger Penrose, author of The Emperor's New Mind, which I must admit to having liked, with some (significant) reservations.

At that IJCAI meeting there was another robotics contest. The participants are getting pretty adept at picking up trash and throwing it into wastebaskets. The emergent theme of the conference was Sifting Through the Trash, er, I mean, Data Mining. A number of presentations dealt with the growing problem of databases too large to search manually and the need for tools for intelligent search. One such tool, described in that same issue, sifts through records of financial transactions to find possible money-laundering schemes. The cynic might wonder whether the most likely customers for such a program are not, themselves, the biggest money launderers, but we don't think like that, do we?

The Swiss Army Chainsaw

"We will encourage you to develop the three great virtues of a programmer: laziness, impatience and hubris."

--Larry Wall, in Programming Perl

Perhaps while we're sifting through the electronic trash, we'll use a tool called a "Pathologically Eclectic Rubbish Lister." Of course, some think that Perl stands for "Practical Extraction and Report Language." Perl is an interpreted language developed by Larry Wall and distributed over USENET. It's not AI. According to that authoritative source, the Jargon File, it "superficially resembles awk, but is much hairier. UNIX sysadmins, who are almost always incorrigible hackers, increasingly consider it one of the languages of choice. Perl has been described, in a parody of a famous remark about lex, as the 'Swiss-Army chainsaw' of UNIX programming."

I've been describing somewhat obscure little languages in this column for a few months now, and I have to admit that Perl is not nearly as obscure as those described thus far. It also is not a new or different paradigm--explicitly not, according to its creator, who, in a refreshingly agnostic passage, says: "It has been stated that a language is not worth knowing unless it teaches you to think differently. Perl is the exception to that rule... because much of Perl is derived in spirit from other portions of Unix." Okay, it's not novel. It's UNIX-like and C-like and utterly derivative. Nevertheless, in the space I have left this month, there's just enough room for a peek at the Perl paradigm, so let's peek.

Free and Easy

Perl was intended as a data-reduction language, the low-level, non-AI analog of data mining. It has come to be widely used for Internet programming. One reason might be its dataflow-tracing mechanism, known as taint checking, which tracks which data may be derived from insecure sources. That alone makes it a Webmeister's dream.
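Taint checking is Perl's own mechanism, but the underlying idea is simple enough to sketch in a few lines. The sketch below is in Python, not Perl, and the `Tainted` class, `from_network`, and `run_shell_command` are invented purely for illustration: data from an insecure source is marked on entry, the mark propagates to anything derived from it, and a dangerous operation refuses marked input.

```python
# Conceptual sketch of taint-style dataflow tracing (not Perl's
# implementation). Values from insecure sources are marked, the mark
# propagates through derived values, and risky sinks reject marked data.

class Tainted(str):
    """A string derived, directly or indirectly, from an insecure source."""

def from_network(raw):
    # Anything arriving from outside is marked tainted on entry.
    return Tainted(raw)

def combine(a, b):
    # Taint propagates: a value built from tainted data is itself tainted.
    result = str(a) + str(b)
    if isinstance(a, Tainted) or isinstance(b, Tainted):
        return Tainted(result)
    return result

def run_shell_command(cmd):
    # A "dangerous" sink: refuse any value that traces back to the network.
    if isinstance(cmd, Tainted):
        raise ValueError("refusing to execute tainted data")
    return "ok: " + cmd

safe = run_shell_command("ls /tmp")            # untainted, allowed
risky = combine("rm -rf ", from_network("/"))  # tainted by derivation
# run_shell_command(risky) would raise ValueError
```

Perl does this bookkeeping automatically for every value in the program, which is what makes it practical for scripts that handle input from strangers on a network.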

Wall, a recipient of one of this year's Dr. Dobb's Journal Excellence in Programming awards (see page 16), began by trying to combine what he liked about C with what he liked about UNIX shell programs. He also decided it would be good to have all the capabilities of awk and sed, so he lumped them in, making Perl a capability superset of both. And he made it an interpreted script language for speed of development, but built in such strong pattern-matching and text-manipulation features that it can often outperform C programs in these areas. Webmeisters also appreciate that it's easy to write short Perl programs that respond to text messages with specific actions.
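That "match a text message, take an action" style amounts to a regex dispatch table. Here is a minimal sketch of the idea, written in Python rather than Perl, with the message formats and handler names invented for illustration:

```python
import re

# A tiny regex-dispatch sketch of the pattern-match-then-act style.
# Each incoming message is tried against a list of patterns; the first
# match triggers the corresponding action.

def subscribe(m):
    return "subscribed " + m.group(1)

def unsubscribe(m):
    return "removed " + m.group(1)

HANDLERS = [
    (re.compile(r"^SUBSCRIBE\s+(\S+@\S+)"), subscribe),
    (re.compile(r"^UNSUBSCRIBE\s+(\S+@\S+)"), unsubscribe),
]

def respond(message):
    for pattern, action in HANDLERS:
        m = pattern.match(message)
        if m:
            return action(m)
    return "no action"
```

In Perl the pattern matching is built into the language syntax, which is why such scripts tend to be a handful of lines; the structure, though, is the same.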

Perl is free and available for most platforms. The distribution comes with a bunch of libraries, and there are translators that turn awk and sed programs into Perl programs, so your "legacy" awk and sed code can make the transition to Perl.

But the most intriguing thing about Perl is that it has caught the eye of Cygnus Support. Cygnus Support does for-pay support for free software, especially gcc, the GNU C compiler written by free-software champion Richard Stallman. If Cygnus Support--or another company with the same vision--supports more free software products, it could make a large amount of excellent, but not commercial-grade, academically developed software available in a practical way to the world at large. It may even mark a shift in how people make money from software. And that would be good.

The first successfully marketed toy digital computing device, at least according to some sources, was Geniac, developed by Edmund C. Berkeley around 1955-56.