Dr. Dobb's Journal August 1998
There's a nice piece on C++ versus Java by Andrew Madden in the June 1998 issue of Red Herring magazine. Madden contrasts the philosophies of the languages' inventors, Bjarne Stroustrup and James Gosling. Modesty versus hype. Openness versus proprietariness. Performance versus development time. Those are issues that divide C++ and Java users, and they also appear to be the philosophical issues dividing the inventors.
Is modesty a virtue in a programming language? Stroustrup thinks so. Invisibility, even. "If you know what language you are using," he says, "there is something wrong. You shouldn't be able to tell." The "you" in this case is the user, I guess. Even Bjarne isn't modest enough to suggest that the programmer shouldn't be able to tell the difference between C++ and Java. But he may be mystical enough. Red Herring notes that Stroustrup's thinking was heavily influenced by the Danish philosopher Søren Kierkegaard. Gosling, as we know, was influenced by an oak tree growing outside his window.
I know how Bjarne feels. As I researched this month's column, I was overwhelmed by Kierkegaardian angst. Okay, I admit I've never read Kierkegaard. I've never gotten past the creepy titles. For all I know, the guy may have been as upbeat as a Ritchie Valens tune, but the titles have always filled me with a Danish dread. I'll try not to inflict too much of that Danish dread on you this month. Maybe I can steer the column toward a more Ritchie Valens tone by the end.
David A. Patterson is best known as a pioneer of RISC. Some years back, at the University of California at Berkeley, where he still teaches, he led the design and implementation of RISC I, which was probably the first VLSI Reduced Instruction Set Computer. This work taught the world about RISC and became the foundation of the SPARC architecture. Although RISC and CISC partisans argue about the relative merits of their instruction-set-complexity preferences, there is no doubt that RISC has had a large impact on microprocessor design. Patterson was also a leader of the Redundant Arrays of Inexpensive Disks (RAID) project, another highly influential effort.
Patterson is clearly someone to watch when trying to guess where computer architecture might be headed. This lends particular interest to his recent announcement that he had decided to "declare victory and look for new research challenges."
He found one in personal mobile computing. Patterson is looking forward to a small device that combines a portable computer, cellular phone, digital camera, and video-game player, probably making heavy use of speech I/O.
Such a device creates new architectural challenges. Real-time performance becomes more important than it is in today's microprocessors. "These programs," he says, "typically operate on vectors of eight-bit or 16-bit samples of audio and visual data and 32-bit floating-point data, not the 64-bit data of today's machines." Then there's the demand for high performance for multimedia and DSP functions, as well as for energy-efficient, area-efficient, scalable designs.
Patterson thinks that current microprocessor architectures are not ideal for such devices. He proposes reviving vector architectures.
Vector architectures, Patterson figures, are well suited to the narrower widths and real-time demands of multimedia, and they scale well with an increasing number of transistors for future integrated circuits. Another issue is high-level programmability. Vector architectures have a foundation of compiler research that supports high-level language programming. Conventional DSPs don't. Patterson also argues that the memory interface of vector architectures is faster and more elegant than that of MMX-style instruction set extensions.
Vector architectures would benefit from the technology that Patterson calls "intelligent RAM," or "IRAM." He thinks that IRAM is the key technology for mobile computing. IRAM combines processing and memory on one chip, with benefits in memory latency, memory bandwidth, energy efficiency, and size. Patterson foresees a single gigabit IRAM with an internal memory bandwidth of almost a terabit per second -- a hundred times faster than anything available today.
A lot of design effort today is spent on dealing with the speed mismatch between microprocessors and memory. This would be a dead issue with IRAM.
Patterson and others are digging into IRAM issues at the Berkeley IRAM Project, http://iram.cs.berkeley.edu/. A number of companies, including NeoMagic, Accelerix, MOSAID Technologies, Silicon Magic, and MOSYS, are working on IRAM-related projects. Patterson sees some interesting consequences of a move to vector architectures and IRAM.
Today, we know what companies dominate the microprocessor market, and we know who dominates the memory market. If IRAM, which combines processor and memory into one, becomes a big deal, all bets are off -- it's not at all clear what companies would be the big winners.
In the June 1998 issue of Scientific American, IBM maintains that 100 Percent Pure Java is just common sense, Microsoft claims that Windows is the part of the machine that is human, and Apple identifies with a crash victim. But those are the ads.
The editorial part of the issue contains reports on some topics recently covered in DDJ: Ron Rivest's chaffing and winnowing technique for message security, XML, Isaac Chuang's work on quantum computing, and Bob Bemer's solution to the Year 2000 problem.
The piece on Bemer's Vertex 2000 program said that at SciAm's press time, it hadn't shipped yet, so I visited the BMR software site at http://www.bmrsoftware.com/ to see the current status. As of our press time, Vertex 2000 is in beta testing, which may be good enough for some potential customers.
But is it already too late to do any good? Depends. January 1, 2000, is not the universal crisis date. (According to Bob, the Global Positioning System, upon which much commerce depends, will run out of zeros in its clock on August 22, 1999, when its 10-bit week counter rolls over.) But BMR is predicting that it should take only 45 days to get most of the Y2K bugs out with Vertex 2000, because it is, after all, a computerized solution. Now Bob has a new, simple (and grandiose) proposal that will get around a big chunk of the Y2K problem for free. It doesn't help with the problems inherent in manipulating and calculating with dates, but it does address the date I/O problem.
He has come up with something he calls the "Xchange Day." It's a simple variation on the Julian Day, which unambiguously represents any date as the number of days from a fixed reference day. Bob's Xchange Day is the Julian Day tweaked to fit in the same space allocated to dates in Y2K-problematic databases.
Let's make Xchange Day the gold standard -- the Euro -- of date exchange internationally, Bob says. Let's challenge Microsoft to support Xchange Day in its operating-system software. Let's invite Sun to post free Xchange Day applets to the Web.
You gotta admire his enthusiasm. Read all about Xchange Days at his web site.
I went to the Apple Worldwide Developers' Conference in May. Usually, I sample the mood of the developers at the conference, but this year, I decided that several factors made such sampling unreliable. Let's face it: The only developers left at these things are those fanatics who have stayed with Apple for years after it no longer made economic sense to do so. They stayed against all logic and self-interest. These people are deeply into justification, denial, and cognitive dissonance. And during the keynote, they came under the influence of the Steven P. Jobs Reality Distortion Field.
So when Jobs announced that Apple had yet another operating-system strategy, they didn't charge the stage with shotguns and pitchforks as any normal enraged crowd would.
And strangely enough, it looks like this time Apple may have got it right. I'm talking about the OS strategy, although there is evidence that Apple has got it right on other fronts as well.
The numbers are all positive. The company has had two profitable quarters in a row. Its cash on hand has increased in the past two quarters. The stock has risen steadily from 13 in January to 30 in May. And market share for Apple products is increasing.
The product line makes sense. A year ago, Apple had over a dozen Mac models, it was hard to tell why you'd buy one rather than another, and none was blowing anybody away. Today, the line consists of three models (and a fourth to come early next year), with customizing variations. Anybody can understand the line -- there's a pro desktop machine and a consumer desktop machine (the trippy translucent blue-and-white iMac), a pro portable and a consumer portable (the MacOS-based replacement for the eMate, due out next year). And the machines are all hot. BYTE benchmark results confirm that in each category, Apple is producing top-performing machines. That may or may not be the best strategy (although it's what you'd expect from perfectionist Jobs), but for the moment it is working.
The OS strategy makes sense, too, although it probably would have made even more sense a year ago. Or 11 years ago. And it's a jolt for committed Rhapsody developers.
The first sentence Steve Jobs uttered about Rhapsody at WWDC was in the past tense. Jobs left no doubt that the Rhapsody strategy was history. Apple would instead be directing its system software efforts toward the MacOS, improving OS 8 and working toward a major release, OS X (ten), that will represent the convergence of the Rhapsody core and the Mac OS UI. OS X, due out next fall, will include preemptive multitasking, memory protection, and dynamic resource allocation.
The Rhapsody strategy had been awfully complicated. Rhapsody would run on PPC or Intel platforms. It would contain a Yellow Box, basically the OpenStep environment from NeXT, and a Blue Box, a Mac-within-Rhapsody environment where Mac apps would run. Except that on Intel, the Blue Box would not be available. Apps written for the Mac would run in the Blue Box on PPC machines, and Yellow Box apps would run on Rhapsody for PowerPC, Rhapsody for Intel, the MacOS, and Windows 95/98/NT. There were more complications.
MacOS X looks simpler, on the surface. It's the strategy that most people expected when Apple bought the NeXT OS over a year ago. The good parts of the MacOS will be salvaged, and the creaky substructure will be completely replaced by NeXT technology. The new API, made up of 6000 of the 8000 existing MacOS system calls, plus a few new ones, is called Carbon. "All future lifeforms," Jobs said, "will be based on it."
Carbon is the core set of APIs for applications that can be deployed on OS X or OS 8. Carbon will let app developers take advantage of the preemptive multitasking, memory protection, and dynamic resource allocation in MacOS X while maintaining source-code compatibility with MacOS 8 apps. It is a work in progress, and Apple is soliciting input on it at carbon@apple.com. But one unshakable commitment is that apps written with the Carbon APIs will run on both OS X and OS 8.
So what has happened to the boxes?
One developer said it looked like Apple had replaced the Blue Box with a black box.
Or is Apple promising a transparent Blue Box?
Just kidding. Apple is indeed promising a transparent Blue Box in OS X, so old unCarbonized apps and 68K apps will run transparently. MacOS X will use Rhapsody's Mach microkernel, a Mac UI, the Carbon APIs, the transparent Blue Box, the Yellow Box, and a BSD/POSIX layer. This will let users run existing MacOS apps, Carbonized Mac apps, BSD/POSIX apps, and Yellow Box apps directly in the OS. Another interesting wrinkle is that there will be no ROM in MacOS X.

But the new OS strategy isn't as simple as it looks, or as different from the old one. The Rhapsody operating system still exists, even if the Rhapsody strategy is finished. Developer Release 2 for PowerPC was handed out at the show, with DR2 for Intel and Yellow Box for Windows to follow shortly thereafter. The 1.0 releases are scheduled for the third quarter of this year, with the target market being servers and development machines. And Rhapsody will be supported beyond the release of OS X, even though Rhapsody and the MacOS will "converge" at some future date.
The real issue is the Yellow Box, and the real real issue is Yellow Box for Intel.
People who were developing for Rhapsody were doing so because of the possibility of writing apps that would run on all Apple computers and on all Wintel boxes. Jobs left the Yellow Box strategy unclear, but other Apple personnel reassured everyone that Yellow Box for Intel is still alive. Yellow Box, they insist, is the key to cross-platform deployment. And this is interesting: Despite all the hype we're going to be hearing for Carbon and OS X, Apple recommends Yellow Box -- not Carbon -- for new application development.
By the way, the question arises, why OS X rather than OS 10? Hadda be marketing. OS X carries a subliminal message that OS 10 doesn't. Sex sells, even if it's just two-thirds of the word "sex."
I exchanged e-mail recently with reader Christopher Miller. In the course of our conversation, I learned that Chris had spent a big chunk of his C++ programming earnings on a sailboat, which he named "Seaplusplus." Apparently, the Coast Guard only allows alphanumeric identifiers in their namespaces. Chris sails out of San Diego and Ensenada. I asked if Seaplusplus had ever given him any trouble, and he told me the following story:
While I was on a 400 mile trip up the California coast, my self-steering electronics got a little saline seepage in the right components and developed a mind of its own. Unpredictably, the helm would get completely locked up in a rapid port turn with no way to manually override it from the cockpit.
This is a dangerous situation at 3 am in 35+ knot winds and 10-foot seas, especially when really, really seasick. From time to time during the trip, my crew would hear me yelling from the cockpit, "Reboot the effing boat!" And that's exactly what they did.
I'm no sailor, and I'm guessing that not too many C++ programmers have had precisely Chris's experience, but I'm equally sure that it will ring a metaphorical bell for some.
DDJ