Dr. Dobb's Journal January 2002
Mark Twain got it right. "Persistent intrusions of weather," he somewhere wrote, "are bad for both reader and author." Magazine columnists should repress the urge to depress their readers with atmospheric lows or to heap drifting snow upon them in intemperate timeliness. Oh, magazines should be timely wherever possible, but since the barometric pressure and wind velocity at the location and date of writing will rarely relate to the ditto and ditto at the ditto and ditto of reading, those periodicals not actually offering seasonal recipes and planting advice should generally skip the weather. Skipping merrily along, let me then simply say that I write this month's column at that indoorsy time of year when one is happy to sit and surf the Internet, don gas mask and gloves and read the mail, and maybe venture out once in a blue moon to interview a local software developer.
This plan of passive harvesting typically weights my plate with a bounty of topics that merit mention. This month, I get filled in on the virtues of virtual, hum along with Apple's digital hubbub, and trip back in time courtesy of the Wayback Machine and Microsoft's employee directory. I also read my mail, where readers write about the Semantic Web, the new feudalism, and the old radio comedians.
I'm sure you know the old puzzle about how to divide a pie fairly between two people if you don't have any way to measure accurately. If "fairly" means "so that neither feels that they got less than the other," one solution is this: Person A cuts the pie in two pieces, and person B gets to choose a piece first. A's strategy is to cut the pie in as close to equal-sized pieces as possible, and B's obvious strategy is just to take the bigger piece if there is any discernible difference. No matter how A cuts the pie, B can assure herself of at least half the pie. A can assure himself of half by cutting evenly, so that B's choice becomes irrelevant (to A, anyway). The solution extrapolates, with some effort, to any number of pie eaters. Also to carving up an e-commerce market.
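The cut-and-choose argument sketches nicely in a few lines of Python. This is my own toy model, not anything from the puzzle's literature: the pie is the interval [0, 1], each eater has a subjective valuation function that assigns the whole pie a value of 1.0, A cuts at a point of his choosing, and B takes whichever piece she values more.

```python
def cut_and_choose(cutter_value, chooser_value, cut_point):
    """Divide the pie [0, 1] at cut_point.

    cutter_value and chooser_value map a sub-interval (lo, hi) to that
    player's subjective value of it; each values the whole pie at 1.0.
    Returns (cutter's value of his piece, chooser's value of hers).
    """
    left = (0.0, cut_point)
    right = (cut_point, 1.0)
    # The chooser takes whichever piece she values more.
    if chooser_value(*left) >= chooser_value(*right):
        chooser_piece, cutter_piece = left, right
    else:
        chooser_piece, cutter_piece = right, left
    return cutter_value(*cutter_piece), chooser_value(*chooser_piece)

# Uniform tastes: a piece is worth its length.
uniform = lambda lo, hi: hi - lo

# A chooser who prefers the right side (more cherries over there, say);
# value density grows with position, normalized so the whole pie is 1.0.
biased = lambda lo, hi: hi**2 - lo**2

# A cuts evenly, so he keeps exactly half by his own measure no matter
# which piece B takes; B, whatever her tastes, gets at least half by hers.
a_share, b_share = cut_and_choose(uniform, biased, 0.5)
assert a_share == 0.5 and b_share >= 0.5
```

Run it and you'll see the biased chooser walks off with 0.75 of the pie by her own accounting, while the even-handed cutter still gets his guaranteed half; that asymmetry of satisfaction is the whole point of the scheme.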
The concept of a virtual Internet service provider is not hard to grasp, and a number of nonvirtual ISPs have grasped it firmly and are doing their best to shake some profits out of it in these times when shaky profits are the only kind to be found in Internet businesses. Some are doing better than others.
Here's the idea: A virtual ISP, call it Bob's Computer Shack, looks to the ultimate customer just like any other ISP, but behind the scenes it's all a sham. Bob doesn't need to have a backroom full of hardware to serve his customers because the only Internet service that Bob actually provides to his customers is a monthly bill. The real ISP is the company for whom Bob is a customer, the VISP vendor.
The ideal candidate to be a VISP is a computer store. People buying new computers are typically in the market right then and there for an ISP. (Not to mention the fact that, having just plunked down five hundred or five thousand dollars, they are also an easy touch for "a few dollars more.") How nice for the computer store if it can say, "Need Internet? We provide that for another eighteen bucks a month. Shall I sign you up? We'll waive the usual installation fee because you're such a good customer." Yadda yadda, first-born child, lifetime contract, et cetera. And how nice, too, if the computer store doesn't actually have to do anything but take the order, e-mail the monthly bills, and cash the checks.
Of course, the checks are only the income side of the picture. The VISP vendor gets its pound of flesh, and since it's running the show and doing most of the work and providing all the hardware and perhaps all the support, it takes a big slice of the pie. (Please excuse the grinding of gears on that clutchless shift of metaphors, but I had to get out of the flesh and into the pastry if I had any hope of justifying the inclusion of the pie puzzle.) The VISP business model is all about slicing the pie. How well the pie gets sliced, both for the VISP vendor and for the VISP, is the viability index for VISP vendors.
I recently had an in-depth interview with Todd Grannis, founder of VISP Technologies, right here in Grants Pass, Oregon. Todd, who was developing BBS software and running a successful online service before there was a World Wide Web, may have come up with this VISP idea before anyone else. He's certainly been at it as long as anybody. His company contracts for thousands of dialup numbers nationwide, so his customers, the VISPs, can legitimately claim to offer as many local dialups as Earthlink. He also outsources tech support, so his VISP customers can offer their customers 24/7 tech support at a cost to the VISP of a buck-and-a-half per customer per month.
Todd fully understands that he's in the pie-slicing business. The latest version of VISP Technologies' software (in beta as I write this, probably released as you read it) provides rich customer usage information and management tools for the VISP, to use or not, at the VISP's option. Smart VISPs will use all the tools and monitor double logons and customer usage patterns closely, because VISP Technologies lets them allocate customers to networks with surgical accuracy and rewards those companies that successfully manage their customers' usage by granting them significant discounts, tuned to their efficiency. The edge that VISP Technologies has is the patented technology that lets the VISP restrict a user to one network or another.
We got deeply into the numbers, and I could see that some of Todd's customers are doing very well indeed. The plan isn't quite equivalent to the pie-slicing puzzle solution, but it's close. Todd's company defines various ways to slice the pie, and the VISP chooses among these slicing options. I saw data for actual VISPs that were selling ISP connections for very reasonable prices and still making large margins with very little time investment. But VISP Technologies makes good money from the VISPs whether they are efficient or not.
So who loses in all this pie slicing? Clearly, it's the businesses actually providing the dialup connections. To the extent that Todd's customers, the VISPs, channel their customers to the perfect (for the VISP) dialup, they make the dialup provider's business model just a bit less attractive. If there were a lot of these VISPs and they were all efficient, this might be a problem, but as things stand now, it's not.
In late October, Apple introduced its iPod, a $400 music player that fits in a shirt pocket, downloads music rapidly, holds what a 5-GB hard disk holds (Apple says 1000 songs), and looks really classy. This may or may not be a killer Christmas gift, it may or may not be the slickest music player to date, and it may or may not be too expensive, but the paradigmatic question is: "What does this first product of Apple's Digital Hub strategy tell us about that strategy?" (You may recall that Steve Jobs has said that the Mac's future is to serve as the hub for a digital lifestyle, which for me evokes a vague picture of consumer digital devices all at least part of the time connected to, and somehow enhanced by, a Mac. That, and chrome wheel covers.)
I asked myself that question before Apple revealed just what its secret new product was, and tried to figure out what features a Digital Hub product ought to have. Taking my speculations as prognostications, I didn't do terribly well.
I'm sold on 802.11b wireless connectivity, and Apple broke open the wireless LAN market with Airport, so I thought that Airport ought to be part of the picture. Maybe there wouldn't be an Airport card in the first Digital Hub product, but at least there ought to be wireless capability. And I expected Apple to leverage its iTools online service, connecting the device to the Web, but in an Apple-branded way. Wrong on Airport, and I should have known better. 802.11b is pretty slow, and it was clear that the device was going to be either a music or a movie device of some kind; that's where Apple is putting its money and time, and Steve wasn't about to get distracted with a PDA or something else. Apple also has a stake in the success of Firewire, so I should have been able to figure out that Firewire, not 802.11b, would be the spoke connecting the new device to the Mac hub.
I was right about the iTools connection, as Apple announced a deal to let Mac users download royalty-free music via iTools. No direct iPod connection, but an indirect one via the hub, a Mac. For the Digital Hub paradigm to mean anything, Apple has to make the Mac essential for getting the most out of these devices. Well, Steve did announce a new version of Apple's iTunes software at the same time as the iPod, and claim that iPod "knows all about iTunes." That's the idea, surely: Make devices that will only deliver their real jewels in combination with Apple's computers and software. How much Mac exclusivity Apple can get away with is an open question; when Microsoft does this sort of thing, the Justice Department gets stressed out, but Apple isn't Microsoft and probably won't raise any official hackles.
Where are they now? If they've still got it, they may be doing research for Microsoft. Case in point: Tony Hoare.
Hoare got into programming back in the 1950s. He's the inventor of the Quicksort algorithm, which is enough to place him in the computing hall of fame, but he also led the team that built one of the earliest Algol 60 compiler implementations, and became one of the leading lights in research into concurrency and provable correctness in software. Now he's doing research for Microsoft. You can read about what he's done, as well as find out who else has "retired" to Microsoft at http://research.microsoft.com/.
On the other hand, if they once worked for Microsoft and no longer do, maybe they're exploring their spiritual sides. Cerise and William Vablais were, respectively, the manager of the Special Campaigns team of the Microsoft Site Builder Network and the manager of University Research Programs at Microsoft. Recently, they helped fund a college of astrology in Seattle. I'm not sure why I find this incongruous; astrology is a more widely accepted belief system than Jedi Knighthood or Mormonism. Many will doubtless find the Microsoft-to-Astrology career vector totally congruous.
On the third hand, if they never worked for Microsoft, there's always the Wayback Machine (http://www.archive.org/). The Wayback Machine won't handle all your queries about the history of technology, its reach extending back only as far as 1996, but it will tell you a lot about that period. It's the new public-access portal to the Internet Archive, Brewster Kahle's project to freeze-dry the entire Web, as it grows and transmogrifies, for posterity. Just after its launch, the Wayback Machine did a very impressive impression of the Web today by choking on too many hits. "Access to the past," the site announced while it caught its breath, "will be available in the future."
Stefan Decker writes to take issue with my deliberately provocative claims in the July "Programming Paradigms" regarding the Semantic Web and artificial intelligence. Decker is a project leader of the OntoAgents project in the DARPA DAML program, is the maintainer of http://www.SemanticWeb.org/, has "been working on Semantic Web Stuff since 1997," is a member of the OIL Steering committee and member of the DARPA DAML joint committee, which defined the latest release of the DAML+OIL ontology language, and is author of the first RDF query and inference service. So he clearly knows more about the Semantic Web than I do. He wrote at some length, and I hope I don't do his ideas a disservice by editing them down and paraphrasing. Regarding my claim that the Semantic Web is an AI project, he says:
The Semantic Web is as much an AI project as a database project, systems project, natural language processing project, information visualization, data structures project, etc. Sure, certain AI techniques are useful [but] inferencing (e.g., by theorem provers) is NOT an important aspect of the Semantic Web.
He acknowledges that theorem proving has some niche Semantic Web applications, but says it's not central. Regarding my stronger claim that the Semantic Web is an AI project that could actually work, he says:
The Semantic Web is not a product, something that works. It is rather a process of connecting different software on the Web to each other, such that one program understands the output of another one, even if that was not intended. This will happen anyway (and has already happened). The goal is to make writing the connecting software as cheap as possible, so that it becomes economically feasible to let more and more systems talk to each other. This will not happen at one particular point of time. It is a process. But otherwise, yes, I think to develop the technology and to turn the black art of data transformation into an engineering discipline is feasible; we already presented some results and are working on it further. So in this sense it will work.
Regarding my strongest claim, that the Semantic Web is an AI project that could result in a Web that we might feel compelled to call intelligent, he says:
If machines are able to make sense out of data that were not intended for these machines, is that intelligent? I think it is a bunch of more or less clever algorithms and data structures.
I'll leave it to you to decide whether Decker has scored any points against me, but I will say that, while I don't challenge anything he says, I don't see anything in it to make me retract anything that I wrote. Of course it can be argued that, even if the Semantic Web can properly be viewed as an AI project, I shouldn't say so because AI has such a reputation for overpromising that the label could turn people away. But that's a political, not a factual, argument.
Reader Karl Stengel wasn't sure he bought Jeremy Rifkin's view of the age of access ("Programming Paradigms," DDJ, November 2001), but he was inspired by it to write me an even longer e-mail than Stefan Decker's. This paragraph from his communiqué really warmed my heart:
One of the reasons we've avoided severe recessions in the last twenty years is consumer willingness to go into debt to buy goods. Economists are always complaining about our low savings rate, yet if we saved a lot more we'd be thrown into a recession. Low wages force people to choose between saving and buying; they can't do both.
I don't know whether he's right or not, but I'm thrilled to finally get some credit for my efforts in the cause of running consumer debt up to record levels.
Mark Lutton properly, if belatedly, skewers me for misuse of the word "factoid" ("Programming Paradigms," DDJ, March 2001) thus:
Norman Mailer coined "factoid" in his 1973 book Marilyn. Specifically it means a false piece of information fed to a magazine or newspaper for publicity purposes. When you hijack "factoid" to mean "trivial fact," you leave us without a word for a maliciously planted lie.
Finally, this from Mark Knutsen regarding my November "Swaine's Flames" column: "Thanks for a very funny column. You should know that at least some of your younger readers got the Bob and Ray reference." To those younger or otherwise readers who have no idea what Mr. Knutsen is talking about, I merely mention that Bob Elliott and Ray Goulding were the funniest men who ever lived. And invite you to join me again next month for more incoherent ramblings from..."Programming Paradigms." (Theme up briefly and then out.)
DDJ