Other Worlds

Dr. Dobb's Journal October 2001

By Michael Swaine

Michael is editor-at-large for DDJ. He can be contacted at mike@swaine.com.

Why is there anything? And even given the philosophically implausible but difficult-to-ignore fact of there being something — in fact, quite a preposterous and bewildering hodge-podge of something — rather than the perfect simplicity and absolute symmetry of nothing, why this particular sampling of something? Why did we get a universe that contains Flower Power iMacs and Cherry Garcia ice cream? Were they somehow encoded in the cosmic DNA at the moment of the Big Bang? If they weren't, where did they come from? And if they were, why?

I used to worry about things like this until I accepted the many worlds hypothesis. It's popped up repeatedly in roughly the same form among science-fiction writers and physicists and mystics and daydreamers. I imagine that we daydreamers get more simple pleasure out of it than the others, for whom the contemplation of multiple universes is work. The basic idea, which is as deeply as I ever get into it, is that all possible universes exist in a great multiverse that is almost as pure and symmetric as the void, and that we just happen to live in this one universe out of the many possible universes, and that ours is no more strange or asymmetric than any of the others. They are all odd, in the sense of an odd sock in the wash, but in the multiverse, all the socks are paired.

I have a feeling that this is just a way of putting the question out of sight, but it works for me. I've always lived in multiple worlds anyway.

Expo World

Macworld Expo in New York this past summer was apparently such a yawner that I'm glad I opted to stay home. By all reports, the congenitally charismatic Steve Jobs gave a bumbling and lackluster keynote address, with high points that included the promise of modest processor speed increases, even more modest quarterly profits, uninspiring tweaks to the current product lines, rebates and price cuts, and the announcement that the dot-one rev of Mac OS X wouldn't be out until September. The new machines would ship with either OS 9.1 or 9.2 as the default OS (though users would be able to turn on OS X if they wanted), underscoring the fact that OS X is not really all there yet, a fact that anyone trying to develop software for it or use it on a network already knows. That is why we were so eagerly awaiting the dot-one release, which we optimistically hoped would fill all the holes, and which it very well might, when it ships.

To be fair, everything that Apple announced at the show seems eminently sensible; it's just not the kind of drama we have come to expect from Apple, and Macworld Expo is traditionally where the company shows off everything it's got. So there are two ways of reading the Macworld nonevent: Either Macworld isn't where it's at any more, or Apple hasn't got anything right now.

Both, I think, are partly true: Apple has begun making announcements at other venues, spreading out its announcements over the course of the year rather than having just two major press events a year, at the East and West Coast Macworlds. And the rumored and expected announcements just didn't come together in time for the show. Reports about the 10.1 release of OS X are good, but it just wasn't ready to release at showtime. And although Apple would love to phase out the last of its CRTs, that decision is at the mercy of flat-panel pricing. Apple is doing everything in its power to win points in the education market, and raising prices on iMacs would send exactly the wrong message. As for the rumors of an Apple PDA, even if, despite Apple's strong denials, such a product is in the pipeline, the current doggish market would not be the best environment to launch it into.

Mac Worlds

Since Apple has chosen to concentrate on niches (like education) more than on overall PC-industry marketshare, it makes sense to think of Apple's customer base more as a confederation of disparate groups. There is just one Macworld (well, actually several) but there are a number of different Mac worlds.

Apple's first world these days is always the world of Apple investors, business reporters, and financial analysts. Steve Jobs held his now-traditional conference call on the state of Apple finances and even made himself available for grilling by CNN's Lou Dobbs (who has returned to financial reportage after his dot-com venture failed to make him rich). In the doldrums of the computer business this year, Apple's financials stirred a slight breeze, and that was enough to keep the money people happy.

Until the lackluster Macworld, that is, and then the stock dropped 17 percent.

The lack of flash from Apple cast a shadow over bright news from a lot of third-party developers. Despite my derogatory reference earlier, it is possible to develop and deploy apps for OS X right now, and over 1000 of them are out there. At the Expo, the third-party developers got to be the stars for a change.

We in the press are always hungry for nonlinear phenomena, nonzero-sum games, the improbable triumph of the underdog. No wonder Apple continues to engage us. It's not just that it's unpredictable; it's that Apple has a habit, especially since the return of Steve Jobs, of changing the rules of the game. Apple is more important than its market share, says Steve Lohr at The New York Times.

The digital hub thing is one of those attempts to redefine the rules of the game. It's presently just one of Apple's niche markets, the home market, and thus one of the Mac worlds, but Apple hopes to turn it into something more. This is a very Jobsian vision, very much in the spirit of the original Apple Computer.

I hate to praise a winner, especially an obnoxious one, but it needs to be repeated occasionally that the vision that made Apple something more than just another computer company has always been Steve Jobs's. No one else ever articulated it as well; no one else made it the priority that he did. Although he is short on technological and business skills, Steve Jobs is always a force to be reckoned with in the technology business, and it is because of his vision.

The Lost World

That said, I still wouldn't want to work for the guy.

One Mac world that Apple would like to forget about is the lost world of stackware. HyperCard authors are still out there generating stacks, but their influence and visibility is negligible compared to what it was in the late '80s when HyperCard was new and there were, by my no doubt incomplete count, more magazines dedicated to HyperCard and stackware than to Apple and the Mac generally.

The best news for the die-hard stackheads may just possibly be the release, in July, of Revolution 1.0 from Runtime Revolution, the tiny software development house in Edinburgh, Scotland, whose young founder, Kevin Miller, I met with at Apple's Developer Conference (as recounted here in September). HyperCard authors seem to be getting Kevin's message; in the weeks following its 1.0 release, Revolution was the hottest topic on the HyperCard list that I follow.

Revolution is based on Scott Raney's robust MetaCard engine. MetaCard is a powerful tool, but not easy to use. Revolution puts a new face on it, and adds a lot of functionality as well. If Apple were to release a new version of HyperCard, add the kinds of features that would be required in a 21st-century software product, and make it cross-platform (including UNIX), Revolution would be dead. But since this seems about as likely as a Beatles reunion, Revolution is looking like the best bet for HyperCard developers wanting to move up.

That move will involve some pain, because Revolution is not HyperCard. It is very much in the spirit of HyperCard, and it has native color, powerful animation capabilities, and many desirable features that HyperCard was never given.

To cite one example, Revolution permits multiple backgrounds, or as Revolution refers to them, "groups." In HyperCard, a card can belong to only one background; Revolution lets a card belong to multiple groups. And although both HyperTalk and Revolution's language are more object based than object oriented, it seems to me that this gives Revolution something akin to multiple inheritance.
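The column doesn't show any actual Revolution or HyperTalk code, but the inheritance analogy can be sketched in Python (the class and handler names here are hypothetical, purely for illustration): a card that belongs to several groups can pick up a handler from any of them, much as a Python class inherits methods from multiple base classes.

```python
# A sketch of the analogy only, not Revolution's actual object model.
# In HyperCard, a handler not found on a card is looked up in its one
# background. Revolution's groups behave more like multiple bases: the
# message path can pass through several of them.

class NavigationGroup:
    """A hypothetical group supplying navigation handlers."""
    def go_next(self):
        return "navigation handler"

class ArtworkGroup:
    """A hypothetical group supplying drawing handlers."""
    def draw_border(self):
        return "artwork handler"

# A Revolution-style card can sit in several groups at once, which is
# roughly what multiple inheritance gives a Python class.
class Card(NavigationGroup, ArtworkGroup):
    pass

card = Card()
print(card.go_next())      # handler found in NavigationGroup
print(card.draw_border())  # handler found in ArtworkGroup
```

A HyperCard-style card, by contrast, would correspond to a class with exactly one base: only one place to look when the card itself lacks a handler.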

Kevin is very interested in reaching the education market. Standard licenses start at $349, but K-12 educational licenses start at $2. There's a Starter Kit Edition that's free for unlimited use, but limits the amount and complexity of code. The Standard license and Starter Kit both let you distribute the software you create with Revolution on every supported platform, royalty free. That includes Mac, Windows, Linux, and various flavors of UNIX.

But HyperCard veterans contemplating a Revolutionary move should be prepared for four things: your existing stacks will break, you'll have to learn new stuff, Revolution is not as stable or bug-free as HyperCard, and it is not (as some versions of HyperCard were) free.

The REAL World

REAL Software has brought out version 3.5 of its REALbasic development environment for the Mac. I've written about REALbasic here before, and I probably will again, since I'm writing a book on it.

One interesting feature of this release is a built-in scripting language. Another interesting thing about REALbasic is its popularity. REALbasic is developing quite a following, and a lot of decent software is being developed with it. I think it may have hit some sort of sweet spot in terms of power and ease of use. It's not for everyone and certainly not for every task, but it is a good tool for creating cross-platform apps. Not as cross-platform as Revolution, though: So far, RB is not a UNIX product.

Joel's Worldview

There are books that you read to learn something new, and there are books that you read to be reinforced in your opinions. I have a hunch that Joel Spolsky's User Interface Design for Programmers (Apress, 2001; ISBN 1-893115-94-1) is one of the latter. Not that there's anything wrong with that.

Joel writes a weblog called "Joel on Software" (or http://joel.editthispage.com/, if you're a web browser), where much of the information in this book originally appeared. One recent essay asserted that all good software takes 10 years to write. So you see, he has opinions.

A couple of side issues about this book: Allen Holub, one-time columnist for this publication, was the technical editor. That's good; for a book on GUI design, it's probably even overkill. And Apress, the book's publisher, is an equity-based technical book publishing house run by programmers. I'm not pushing Apress to any programmers out there considering doing a book, but the business proposition, which trades advances and royalties for a piece of the company, is an interesting one.

As for the book itself, it's short and readable. And opinionated. He picks apart commercial products from big companies, showing their UI mistakes. I love that. There's a nice example with Excel in which you click on one background window and another one is selected. Described that way, it sounds obvious that it's the wrong behavior, and I think it is, but it's perfectly appropriate given the UI model that the programmers used. It's just that it's a bad model.

He also shows "the most moronic wizard dialog Microsoft has ever shipped." Not all his bad examples are from Microsoft products. One bad example involves the Macintosh trashcan. No, not the fact that Apple decided to blow the logic of the metaphor all kerflooey by letting you use the trash can to eject a disk. It's the fact that they pushed the metaphor too far. Wouldn't it be nifty if you could tell that there was something in the trash by looking at the icon? So a full trashcan looks like a full trash can. Well, it's not nifty, because that full trashcan encourages all us neat freaks to do something about it. So we compulsively empty the trash. But the whole point of having a trashcan at all was so that you could change your mind about throwing something away. You mean you really wanted that old document? Dig it out of the trash. Only you can't if you always empty the trash as soon as there's something in it. Metaphors should be honored, but not mindlessly.

I don't know if I learned a great deal from User Interface Design for Programmers, or if I'd turn to it as a reference when confronted with a user-interface problem, but as somebody who does think about user-interface issues from time to time, I'm glad I read it.

My World (and Welcome to It)

I can't believe, after the tumult finally died down from my insulting the Canadians, that I would be so dumb as to call actress Lucy Lawless an Australian. Now the New Zealanders are on my case. Of course, I provoke all my international incidents in my "Swaine's Flames" column, not here in "Programming Paradigms," where I never make mistakes. But still.

Luckily I had just carved out space for an errata and addenda at Swaine's World (http://www.swaine.com/), so I was able to respond within minutes to the Lawless barrage, and it never reached the proportions of the Angry Canadians mail storm.

Some of my mail is not about national pride. Philip Hankins writes to tell me that my mention of "Harvard's Aiden Computer Laboratory" ought to be "Harvard's Aiken Computer Laboratory." He's right, of course; it was Howard Aiken who talked Thomas J. Watson into implementing his scheme for the Harvard Mark I. I should have known better, but Hankins certainly knows: He says he was a student in Aiken's course in computer design back in 1956. So my claim that I never make mistakes in this column was itself a mistake; but don't write to tell me about it, okay?

Brad Noren and Jeffrey Olkin both sent in alternate versions of the hat color problem from my July column. Maybe they'll let me post them at my web site. Jeffrey's version appeared in the January-February 1976 issue of the late, lamented Creative Computing magazine, which hit the stands at about the same time as the first issue of this publication, and about the same time that reader Chuck Guzis got his Altair. "8800 chassis, CPU card, 8K (2 boards) of (miserable) dynamic RAM and serial I/O card," he recalls, confessing "I never used the Basic that came with the system, choosing instead a really neat little 5K Basic that came as a hex dump from Intel (no, Bill Gates did NOT write the first Basic for the 8080!)."

Chuck was soon writing his own Basic, and elected to do a compiled one. "Compilation to native code was not a real option and of dubious value, since a program would consist of CALLs and not much else," he says, but he picked up an idea from a former coworker, Richard Dorrance, one of the implementors of IBM COMTRAN, a predecessor to Cobol. "I was intrigued by his methodology of language development: (1) Think of hypothetical machines to compile and execute the language you're working on. (2) Write simulations of those machines. (3) Write your compiler and run time in the instructions of those machines.

"So that's what I did for Durango — the whole job took three of us four months and we demoed the multitasking systems, complete with ISAM in 1978 at WCCF or NCC (I forget). I do recall that [Bill Gates] and Microsoft were there and were quite upset that our multitasking Basic ran rings around their nonmultitasker, performance-wise. So Pascal bytecodes and the Java JVM are nothing new — they have their roots back in the 1960s at least."

Perhaps it's true that there are no new ideas in this world. But at least there are still some that aren't patented.

DDJ