Faux Fear

Dr. Dobb's Journal February 2002

By Michael Swaine

Michael is editor-at-large for DDJ. He can be contacted at mike@swaine.com.

In these timorous times when we open our mail in mask and mittens, FedEx our luggage when we fly, and sacrifice our sanity on the altar of security, it is comforting to know that some fears are phony. Harry Potter will not convert our children to witchcraft. In this column, I'll discuss the following maybe-somewhat-scary subjects in not-so-scary terms: Apple, MacOS X, and developers; the arrival of real nanotechnology; and a book about programming by a New York Times journalist. Just in case those topics are too fraught with fright, I have also included a warm and fuzzy story about a simple country doctor. A doctor who once had a very different kind of profession...

The Book of Lohr

If somebody didn't already buy it for you for Christmas, and if you didn't already buy it for somebody and read it before you wrapped it, and if it hasn't been floating around the office for months, you may want to get your hands on Steve Lohr's Go To: The Story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts — the Programmers Who Created the Software Revolution (Basic Books, 2001; ISBN 0-465-04225-2). Resist the temptation to think that once you've plowed through the title, you've already read most of the book — although 24 words is about a word for every 10 pages of this 250-page book, which, relatively speaking, is longer than some tables of contents. Resist, too, the temptation to brush off this book about programming written by a New York Times journalist with a scornful "What's a Muggle know about casting spells?" Lohr has done his journalistic legwork here, and besides, it's not so much a book about programming as a book about programmers.

Lohr properly starts with Fortran, the language that clarified the idea of just what a programming language should be and made programming a viable career option. It's necessary to remind oneself now and then that programming is a very recent invention. Many of the first real computer programmers are still with us, and still active. The sketch that Lohr presents of legendary Fortran pioneer John Backus at age 76 happily confessing his addiction to his PalmPilot underscores this, and is one of many little nuggets that Lohr unearthed for the book. In another bridging of the old and the new, Lohr comes upon IBM OS/360 legend Fred Brooks (The Mythical Man-Month) in 1993, discussing why UNIX and C and the Mac inspired fan clubs while Cobol and MS-DOS never did. The languages and operating systems that acquire fanatical followings, Brooks says — sounding a little like Steve Jobs — are those "designed to satisfy a designer," rather than designed to satisfy a large set of requirements from different sources. The latter approach, he says, "produce[s] serviceable things, but not great things."

Halloween

Not being technical does get in Lohr's way on occasion, most embarrassingly when he gets the Christmas/Halloween joke wrong. (You know: Why can't programmers tell the difference between Christmas and Halloween? Because 25 Dec = 31 Oct.) And he (more or less harmlessly) misuses the term "declarative language." But the technical errors are minor and few, fewer than the copy editing errors, of which there are more than the inevitable handful that slip through the fingers of even the best editors. Lohr is a stickler for details, though: I now know, because Lohr tracked it down, the correct form of that Alan Kay quote that I've heard dozens of times: "Point of view is worth 80 IQ points." Not "50," not "100," not "context," not "perspective," but "80" and "point of view."
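For anyone who wants the joke's arithmetic spelled out, here is a trivial Python check: "31" read in base 8 (octal) is exactly 25 in base 10.

```python
# The joke hinges on radix notation: 3*8 + 1 = 25,
# so 31 (octal) and 25 (decimal) are the same number.
assert 0o31 == 25          # octal literal
assert int("31", 8) == 25  # same conversion, from a string
print("Oct 31 == Dec 25")
```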

Even if he's not technical, Lohr does get the distinction between Free Software and Open Source software right, and he fearlessly labels Java "the Fortran of the Internet age," which may be an overstatement, but is not an indefensible or an ignorant view. And he strikes a cannily skeptical stance on IBM's embrace of open-source software: "It is sound business strategy for IBM to try to transform the operating system into a profitless commodity, thus undermining two of its leading rivals, Microsoft and Sun Microsystems." A little different from the picture IBM wants to convey of color-within-the-lines corporate drones miraculously morphed into a free-spirited band of born-again open-source evangelists exuberantly spray-painting Linux graffiti all over town.

Because Lohr is writing for a broad audience, he spends a lot of time sketching the individuals. He brings Charles Simonyi in at the beginning of the book and again later, and one can see why he gives Simonyi, who is both a fascinating individual and a sort of archetypal programmer, the space he does. A much larger book than this could have been written on Lohr's theme, which gives him some freedom in terms of which programmers to profile (sorry; couldn't include everybody). I think he chose well in the computer scientist he uses to frame the story, quoting him in the introduction and again in the afterword: Donald Knuth.

Also, because Lohr is writing for a broad audience, he doesn't always tell the story I'd like told. I would have enjoyed more details about John Kemeny's early languages Scalp and DOPE, and some discussion of how these influenced his and Thomas Kurtz's design of Basic. I'd have traded some of the retold tales of Basic's subsequent history (even though this magazine and I get cited in those tales) for the skinny on those early languages. I'd have liked to see more on Lisp and Forth (there's nothing on Forth) and on the major programming paradigms (but I admit to an extreme pro-paradigm bias).
And I admit that structured programming and object-oriented programming and AI all do get discussed, at a general-audience level. Also covered are programming methodologies, the challenge of bringing programming to the masses, the challenge of complexity, and distributed programming. Not to mention bridge players and chess wizards. Finally, though, it's just a good read.

The Horror of TN2034

A reader who identifies herself as a software developer writes to say that she would welcome the opportunity to put in some development time on the Mac platform, as it seems to her to be an excellent machine. What's stopping her is a perceived lack of support from Apple, particularly in terms of documentation of OS X, as compared to what she gets from Microsoft. And so this developer will, reluctantly, stick with Windows.

It's tempting to say that she should have been trying to write Macintosh applications back in 1984; then she'd appreciate the relative wealth of documentation that Apple provides developers today. But this is an old story. Whatever else you may say about Microsoft, the Beast appreciates independent software developers. Fears them, even. Courts them. "For Microsoft, catering to developers — wooing them, helping them, supplying them with useful tools...is a corporate mission." Microsoft is still Bill Gates's company. Apple is again Steve Jobs's company, and that implies a different relationship with outside programmers. One clue that Apple may be a little clueless regarding issues of importance to third-party developers was Tech Note TN2034, posted on November 25, 2001, and removed on November 28, 2001. The Tech Note purported to offer guidelines for developing MacOS X apps, but the advice it gave would, according to developers who protested the note, make apps user unfriendly. The note also referred repeatedly to a language it called C+. Embarrassing.

Still, I think the DDJ reader may be protesting too much. Lots of developers are producing products for MacOS X, so the documentation can't be that bad. Of course, not every third-party developer can command the kind of support from Apple that the developers of some recently released OS X apps can, apps such as Office v.X, Freehand, Quicken, ViaVoice, Mathematica, and Palm Desktop.

Lost In the Nth Dimension

I have had the chance to play around with several development tools on (and for) the MacOS X platform recently, and I wanted to share some thoughts about a couple of them. (REALbasic, the Mac development tool I've spent a lot of time with recently, may be the subject of a future column.)

GZigZag, the Java implementation of Ted Nelson's ZigZag, now runs on MacOS X in the Terminal Window. This is an example of how people are using MacOS X as a UNIX platform, and an opportunity for Mac developers to explore this fascinating new paradigm in the stability of MacOS X. But I'd better explain what ZigZag is.

ZigZag (http://sourceforge.net/projects/gzigzag/) is a new paradigm that addresses the same issues as a database program, filesystem, or personal information manager, but in a novel way. It throws away most of the central concepts of computer use (no folders, no files, no applications, no windows — or maybe just one window) in favor of a radically different and radically flexible way of arranging and working with information. In one demo presentation of ZigZag, Nelson showed off a family tree and an address book developed using ZigZag, then explained that these were just two views of the same information structure, and finally pointed out that there was no address book program or genealogy program running in the background. ZigZag is a tool for representing arbitrary structures of information, sufficient in itself to implement family trees and address books and accounting spreadsheets and personal information managers and timelines and organizational charts.

Designing structures in ZigZag is a matter of mapping out connections among cells in multidimensional, arbitrarily connected space. Data items are placed in lists, or dimensions, which can be displayed in different views, which are basically boxes and lines. But this structural information also resides in dimensions: There is a dimension of dimensions and a dimension of views, for example. Most of us can only handle about three spatial dimensions, and ZigZag doesn't burden us with more than three dimensions at a time, but it can handle an unlimited number. The way that it allows you to rotate through dimensions is arguably a brilliant solution to a daunting and fundamental data-representation problem; it's also arguably the most vertiginously challenging computer-using experience since Flight Simulator, although ZigZag is dizzying on a whole other level.
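To make the cell-and-dimension idea concrete, here is a minimal, hypothetical Python sketch — my own illustration, not the GZigZag API. Each cell holds a value and links to at most one neighbor in each direction along any number of named dimensions, so the same cells can participate in several structures (views) at once.

```python
class Cell:
    """A ZigZag-style cell: a value plus per-dimension neighbor links."""

    def __init__(self, value):
        self.value = value
        # dimension name -> (negward neighbor, posward neighbor)
        self.links = {}

    def connect(self, dim, other):
        """Link self -> other in the posward direction along `dim`."""
        neg, _ = self.links.get(dim, (None, None))
        self.links[dim] = (neg, other)
        _, pos = other.links.get(dim, (None, None))
        other.links[dim] = (self, pos)

    def walk(self, dim):
        """Yield cells posward along `dim`, starting from self."""
        cell = self
        while cell is not None:
            yield cell
            cell = cell.links.get(dim, (None, None))[1]

# The same cells seen through two dimensions -- the "two views of one
# structure" idea from Nelson's family-tree/address-book demo.
alice, bob, carol = Cell("Alice"), Cell("Bob"), Cell("Carol")
alice.connect("d.family", bob)      # family-tree view
alice.connect("d.contacts", carol)  # address-book view

family = [c.value for c in alice.walk("d.family")]      # ["Alice", "Bob"]
contacts = [c.value for c in alice.walk("d.contacts")]  # ["Alice", "Carol"]
```

"Rotating" a view then amounts to nothing more than asking the display to walk a different dimension name over the very same cells.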

Nobody is going to develop any commercial applications with ZigZag anytime soon. In fact, ZigZag doesn't really have the concept of an application (another aspect of computer use that Nelson is only too happy to throw out). Right now, it is a platform for exploring this new way of dealing with information. That's exactly the right first step: If the ZigZag model ever does take off, it will be because enough developers were able to "get it" and were able to see how to use it productively. And that can only happen if there's a sort of playground for trying out ideas in ZigZag space. That's what the current (stable) implementations — of which the MacOS X implementation is just one — offer.

Tales From the Script

I continue to be impressed with AppleScript under OS X. It fairly boggles my (easily boggleable) mind that Apple was able to port its system-level scripting language over to MacOS X so successfully. There are differences between the Classic and OS X versions, of course (the OS X Finder is a completely different application from the OS 9 Finder, and scripts written to control one likely won't work with the other without modification), and there are some things missing from the OS X implementation. But I routinely execute scripts that bridge the OS gap, messaging between Classic apps and OS X apps without a problem. It's an underpromoted and underdocumented tool, in my opinion, but third-party help has arrived for would-be AppleScripters in the form of AppleScript for Applications: A Visual QuickStart Guide, by Ethan Wilde (Peachpit Press, 2002; ISBN 0-201-71613-5).

I like these VQS Guides (I'd better; I'm writing one). Yes, they are beginner books, but they are also cookbooks, and even the best chefs benefit from having easy-to-find recipes for common or obscure preparations. This one gives recipes for scripting the Finder and several common applications.

One company that believes in AppleScript (and in OS X) is Stone Software (http://www.stone.com/), makers of Create, the illustration program. Create has the ability to save a document as an AppleScript script. Then, when you execute the saved script, it rebuilds the document from scratch. What's this good for? It's a wonderful way to learn the AppleScript syntax of the application. Want to know how to work with a certain kind of document with certain features in AppleScript? Build the doc and save it as AppleScript. It would be great if all Mac applications had this capability.

And, of course, AppleScript opens up the underlying UNIX scripting capabilities of MacOS X. Here's UNIX in AppleScript:

tell application "Terminal"
    do script with command "ls -l" -- substitute any shell command string
end tell

The Living Dead

I really should make an effort not to shoehorn HyperCard into every column, but there are some interesting developments afoot for those who care about this product that Tim Berners-Lee acknowledged as an inspiration in his original proposal for the World Wide Web (http://www.w3.org/History/1989/proposal.html).

For one thing, Dave Winer, inventor of outlining software, is working on a project that would unite the desirable features of HyperCard with the outline-handling features of his classic programs — More, Thinktank, and so on. Frankly, I have no idea what that would look like, but if he does it, it'll be worth a look. Also, Runtime Revolution, a HyperCard replacement that I have mentioned here before, has been released for a variety of platforms. MetaCard, which is both the engine on which Revolution is built and a respectable HyperCard-like product in its own right, is now available for MacOS X. And the HyperCard community is, as I write this, trying once again to get booth space at MacWorld Expo to promote HyperCard and/or pressure Apple to upgrade or open source it or whatever. I'll report on developer developments at that MacWorld show in a future column.

Invasion of the Nano Creatures

Nanotechnology is a huge and complicated puzzle. But more of the pieces are being put together.

I wrote about nano a couple of months ago, but I wanted to mention one significant development since then.

Researchers in Israel have built a real computer entirely from DNA. Input, output, storage, and processing are all done via DNA molecules. For now, this is just a proof of concept, and you'd have to network billions or trillions of such nanocomputers to do any useful computing. But other research has shown that the DNA approach can be brought to bear on general-purpose computing. DNA computing is one of the tracks in nanotech research, a track launched only a decade ago and already showing a lot of promise. Granted, there are a lot of problems to be solved. But the possibilities are awesome: computers many orders of magnitude denser than current limits, using orders of magnitude less power, and perhaps not built but grown or self-assembled. And a little scary.

Paradigms Past

Ed Roberts got a little well-deserved recognition in November 2001, when he was inducted into the Georgia Technology Hall of Fame, a permanent exhibit at the SciTrek museum in Atlanta. And the Atlanta Constitution article reported the event under just the right lead sentence: "The only thing Ed Roberts ever wanted to be was a physician." That's true, but it's equally true that he somehow got sidetracked into starting a revolution.

First, the Air Force sidetracked the Georgia boy, as did the engineering degree that it just made sense to get while in the service, and then when he got out he somehow found himself in Albuquerque with friends building and selling mail-order remote-control model airplanes. That experience led, by some peculiar logic, to building calculators, and when Texas Instruments knocked the props out from under that industry, an even more obscure logic compelled Roberts and his bankrupt company, MITS, to go into the microcomputer business.

Correction: To invent the microcomputer business. MITS pioneered small, arguably personal, computers with its Altair box, as well as pioneering or inspiring microcomputer retailing, clubs, conferences, system software, and publications.

The Altair was announced in the January 1975 issue of Popular Electronics magazine. Two years later, Ed sold MITS to Pertec and shortly thereafter was back in Georgia, going to medical school. I remember interviewing him in 1983, in the tiny Georgia town where he was then practicing medicine. It may be the same small Georgia town where he is still, at 61, doctoring his friends and neighbors today. It's what he always wanted to do.

But it's nice to see this acknowledgment of what he accomplished while he was sidetracked.

DDJ