SUPER DISTRIBUTION AND ELECTRONIC OBJECTS

What if there is a silver bullet...and the competition gets it first?

Brad Cox

Brad is the author of Object-oriented Programming: An Evolutionary Approach. He can be reached at The Program on Social and Organizational Learning, George Mason University, Fairfax, VA 22030 or at bradcox@infoage.com. A version of this article first appeared in the Journal of Object-oriented Programming (June, 1992), and is reprinted with the permission of SIGS Publications, 588 Broadway, New York, NY 10012.


Few programmers could develop a compiler, word processor, or spreadsheet to compete in today's crowded software market. The cost and complexity of modern-day applications far exceed the financial and intellectual capacity of even the rarest of individuals. Even large-granularity subcomponents like window systems, persistent-object databases, and communication facilities can be larger than most individuals can handle. But most of us could provide smaller (so-called "reusable") software components that others could assemble into larger objects, components as small as stacks and queues.

So why don't we? Why do we drudge away our lives in companies with the financial, technical, and marketing muscle to build the huge objects we call "applications?" Why don't we start software companies (like Intel) to invent, build, test, document, and market small-granularity objects for other companies to buy? Think of the reduction in auto-emission pollution if more of us stayed home to build small-granularity components for sale! Think of not having to get along with the boss!

Object-oriented programming technologies have brought us tantalizingly close to making this dream technically, if not economically, feasible. Subroutines have long been able to encapsulate functionality into modules others can use without needing to look inside, just as with Intel's silicon components. Object-oriented programming languages have extended our ability to encapsulate functionality within "software-ICs" that can support higher-level objects than subroutines ever could. Such languages have already made the use of prefabricated data-structure and graphical-user-interface classes a viable alternative to fabricating cut-to-fit components for each application. All this is technically feasible already, although the software industrial revolution has hardly begun.

Yet these technical advances have not really changed the way we organize to build software--they've only provided better tools for building software just as before. The prefabricated small components of today are not bought and sold as assets in their own right. They are bundled (given away) inside something larger. Sometimes they are bundled to inflate the value (and price!) of some cheap commodity item, as in Apple's ROM software, which turns a $50 CPU chip into a $5000 Macintosh computer. Sometimes they play the same role with respect to software objects, as in the libraries that come with object-oriented compilers.

There is no way of marketing the small, active objects that we call "reusable software components," at least not today. The same is true of the passive objects we call "data." For example, nearly 50 percent of the bulk waste in our landfills is newspapers and magazines. Nearly half of our bulk-waste problem would be eliminated if we could break the habit of fondling the macerated remains of some forest critter's home as we drink our morning coffee. But this is far more than a bad habit from the viewpoint of newspaper publishers. If they distributed news electronically, how would they charge for their labor?

Paper-based information distribution makes certain kinds of information unavailable, even when the information is easily obtainable. For example, I hate price-comparison shopping and would gladly pay for high-quality information about where to buy groceries and gasoline cheaply within driving distance of my home. This information is avidly collected by various silver-haired ladies in my community, but solely for their own use. There is no incentive for them to electronically distribute their expertise to customers like myself.

What if entrepreneurs could market electronic information objects for other people to buy? Couldn't geographically specialized but broadly relevant objects like my gasoline-price example be the "killer apps" that the hardware vendors are so desperately seeking? Think of what it could mean to today's saturated market if everyone who buys gasoline and groceries bought a computer simply to benefit from Aunt Nellie's coupon-clipping acumen.

Information-age Economics

These questions outline the fundamental obstacle of the manufacturing-to-information age transition. While we're adept at selling tangible goods such as Twinkies, automobiles, and newspapers, we've never developed a commercially robust way of buying and selling easily copied, intangible goods like electronic data and software.

Of course, there are more obstacles to building a robust market in electronic objects than I could ever cover here. Many are technological deficiencies that could be easily corrected, such as the lack of suitably diverse encapsulation and binding mechanisms in today's object-oriented programming languages, insufficient telecommunications bandwidth and reliability, and the dearth of capable browsers, repositories, and software-classification schemes.

The biggest obstacle is that electronic objects can be copied so easily that there is no way to collect revenue the way Intel does, by exacting a fee each time another copy of a silicon object is needed. More than any other reason, this is why nobody would ever quit their day job to build small-granularity software components for a living.

A striking vestige of manufacturing-age thinking is the still-dominant practice of charging for information-age goods like software by the copy. Since electronic goods can be easily copied by every consumer, the producers must inhibit copying with such abominations as shrinkwrap license agreements and copy-protection dongles. Since these are not reliable and are increasingly rejected by software consumers, the Software Publishers Association and Business Software Alliance have started using handcuffs and jail sentences as copy-protection technologies that actually do work, even for information-age products like software.

The lack of robust information-age incentives explains why so many corporate reuse-library initiatives have collapsed under a hail of user complaints. "Poorly documented. Poorly tested. Too hard to find what I need. Does not address my specific requirements." Except for the often rumored "Not invented here" syndrome, the problem exists only occasionally on the demand side. The big problems are on the supply side. There are no robust incentives to encourage producers to provide minutely specialized, tested, documented, and (dare I hope?) guaranteed components that quality-conscious engineers might pay good money to buy. As long as these "repositories" are waste-disposal dumps where we throw poorly tested and undocumented trash for garbage pickers to "reuse," quality-conscious engineers will rightly insist, "Not in my backyard!"

Paying for software by the copy (or "reusing" it for free) is so widespread today that it may seem like the only option. But think of it in object-oriented terms. Where is it written that we should pay for an object's instance variables (data) according to usage (in the form of network-access charges), yet pay for its methods (software) by the copy? Shouldn't we also consider incentive structures that could motivate people to buy and sell electronic objects, in which the historical distinction between program and data is altogether hidden from view?

Superdistribution

Let's consider a different approach that might work for any form of computer-based information, an approach based on the following observation. Software objects differ from tangible objects in being fundamentally unable to monitor their copying but trivially able to monitor their use. For example, it is easy to make software count how many times it has been invoked, but hard to make it count how many times it has been copied. So why not build an information-age market economy around this difference between manufacturing-age and information-age goods?

If revenue collection were based on monitoring the use of software inside a computer, vendors could dispense with copy protection altogether. They could distribute electronic objects for free in expectation of a usage-based revenue stream.

Legal precedents for this approach already exist. Existing copyright law distinguishes between copyright (the right to copy or distribute) and useright (the right to "perform," or to use a copy once obtained). These distinctions were stringently tested in court earlier this century as music publishers came to terms with broadcast technologies such as radio and TV.

When we buy a record, we acquire ownership of a physical copy, but not its copyright. We also acquire a severely limited useright that allows us to use the music only for personal enjoyment. Conversely, large television and radio companies often have the very same records thrust upon them by the publishers for free. But they pay substantial fees to acquire the useright that allows them to play the music on the air. The fees are administered by the American Society of Composers, Authors and Publishers (ASCAP) and Broadcast Music, Inc. (BMI), which monitor how often each record is broadcast and to how large a listening audience.

A Japanese industry-wide consortium, the Japan Electronic Industry Development Association (JEIDA), is developing an analogous approach for software. Each computer is thought of as a station that broadcasts not the software itself, but the use of the software, to an audience of a single "listener."

The approach, which originated with Ryoichi Mori, is called superdistribution because, like superconductivity, it allows information-age goods to flow freely, without the resistance of copy protection and piracy. Its premise is that copy protection is exactly the wrong idea for intangible, easily copied goods such as software. Superdistribution leverages ease of copying by encouraging such goods to be freely distributed and freely acquired via whatever distribution mechanism you please. Users are actively encouraged to acquire superdistribution software from networks, to give it away to their friends, or even send it as junk mail to people they've never met. Broadcast my software from satellites if you want. (Please!)

This generosity is possible because the software is really "meterware." It has strings attached that make revenue collection independent of distribution. The software contains embedded instructions that make it useless except on machines equipped for this new kind of revenue collection.

The computers that can run superdistribution software are otherwise quite ordinary. In particular, they will run ordinary pay-by-copy software just fine. They just have additional capabilities that only superdistribution software uses. In JEIDA's current prototype, these services are provided by a silicon chip that plugs into a Macintosh coprocessor slot.

Electronic objects (not just applications, but active and/or passive objects of every granularity) intended for superdistribution invoke these services to verify that the revenue-collection hardware is present, that prior-usage reports have been uploaded, and that prior-usage fees have been paid.

The hardware is not complicated (the main complexity being tamper-proofing, not base functionality). It merely provides several instructions that must be present before superdistribution software can run. The instructions count how many times they have been invoked by the software, storing these usage counts temporarily in tamper-proof persistent RAM. Periodically (say, monthly) this usage information is uploaded to an administrative organization for billing, using public-key encryption to discourage tampering and to protect the secrecy of this information.
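The metering loop such hardware performs can be sketched in a few lines of present-day code. This is an illustrative sketch only: the class and method names are invented, and a real implementation would live in tamper-proof hardware and would sign and encrypt its reports with public-key cryptography rather than hold counts in an ordinary dictionary.

```python
from collections import defaultdict

class UsageMeter:
    """Toy model of the metering service: count invocations per
    component, then hand over (and clear) the counts for billing."""

    def __init__(self):
        self._counts = defaultdict(int)   # component id -> invocation count

    def check_in(self, component_id):
        """Called by a superdistribution component each time it runs."""
        self._counts[component_id] += 1

    def monthly_report(self):
        """Snapshot the counters for upload, then reset them.
        (The real scheme would encrypt and sign this report.)"""
        report = dict(self._counts)
        self._counts.clear()
        return report

meter = UsageMeter()
for _ in range(3):
    meter.check_in("aunt-nellies-coupons")   # hypothetical component names
meter.check_in("gas-price-finder")
print(meter.monthly_report())
# {'aunt-nellies-coupons': 3, 'gas-price-finder': 1}
```

The essential property is that the counters survive between uploads but are emptied once reported, so each monthly bill reflects only that month's usage.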

The end user gets a monthly bill for their usage of each top-level component. Their payments are credited to each component's owner in proportion to the component's usage. These accounts are then debited according to each application's usage of any subcomponents. These are credited to the subcomponent owners, again in proportion to usage. In other words, the end user's payments are recursively distributed through the producer-consumer hierarchy. The distribution is governed by usage-metering information collected from each end user's machine, plus usage-pricing data provided to the administrative organization by each component vendor.
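The recursive settlement described above can be illustrated with a toy calculation. Everything here is an assumption for the sake of illustration, not JEIDA's actual accounting scheme: the component names, the prices (in tenths of a cent, to keep the arithmetic exact), and the convention that a parent component pays its subcomponents out of its own revenue.

```python
def settle(component, uses, prices, subuses, credits):
    """Credit `component` for `uses` invocations at its per-use price,
    then recursively pass on charges for the subcomponents it consumed."""
    credits[component] = credits.get(component, 0) + uses * prices[component]
    for sub, per_use in subuses.get(component, {}).items():
        sub_uses = uses * per_use
        # The parent pays its subcomponent out of its own revenue.
        credits[component] -= sub_uses * prices[sub]
        settle(sub, sub_uses, prices, subuses, credits)

# Hypothetical example: each run of "app" invokes "queue" twice
# and "stack" once. Prices are in tenths of a cent per use.
prices = {"app": 100, "queue": 10, "stack": 5}
subuses = {"app": {"queue": 2, "stack": 1}}

credits = {}
settle("app", uses=100, prices=prices, subuses=subuses, credits=credits)
print(credits)
# {'app': 7500, 'queue': 2000, 'stack': 500}
```

The end user's 10,000 units for 100 uses of "app" are split automatically: the queue vendor collects for 200 uses, the stack vendor for 100, and the application vendor keeps the remainder.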

Since communication is infrequent and involves only a small amount of metering information, the communication channel could be as simple as a modem that autodials a hardwired 800 number each month. Many other solutions are viable, such as flash cards or even floppy disks mailed back and forth each month.

A Revolutionary Approach

Whereas software's ease of replication is a liability today (by disincentivizing those who would provide it), superdistribution turns this liability into an asset (by allowing software to be distributed for free). Whereas software vendors must spend heavily to overcome software's invisibility, superdistribution thrusts software out into the world to serve as its own advertisement. Whereas the PC revolution isolates individuals inside a stand-alone PC, superdistribution establishes a cooperative/competitive community around an information-age market economy.

Of course, there are many obstacles to this ever really happening. A big one is the information-privacy issues raised by usage monitors in every computer from video games to workstations to mainframes. Although we are accustomed to usage monitoring for electricity, telephone, gas, water, and electronic data services, information privacy is an explosive political issue. Superdistribution could easily be legislated into oblivion out of the fear that the usage information would be used for purposes other than billing.

A second obstacle is the problem of adding usage-monitoring hardware to a critical number of computers. This is where today's computing establishment could be gravely exposed to those less inclined to maintain the status quo.

It is significant that superdistribution was not developed by the American computer establishment, which presently controls 70 percent of the world software market. It was developed by JEIDA, an industry-wide consortium of Japanese computer manufacturers.

The Japanese are clearly capable of building world-class computers. Suppose they were simply to build superdistribution capabilities into every computer they make, not as an extra-price option but as a ubiquitous feature? What if superdistribution metering instructions were built into every next-generation CPU chip, much as ADD and JSR instructions are built in today? Think about the benefits I've discussed in this article and then ask: Whose computers would you buy? Whose computers would Aunt Nellie and her friends buy? What if superdistribution really is a silver bullet for the information-age issues I've raised here? And what if the competition builds it first?

References

Cox, Brad J. Object-oriented Programming: An Evolutionary Approach. Reading, MA: Addison-Wesley, 1986.

Cox, Brad J. Object Technologies: A Revolutionary Approach. Reading, MA: Addison-Wesley; available late 1992.

Cox, Brad J. "Planning the Software Industrial Revolution." IEEE Software (November, 1990).

Cox, Brad J. "There is a Silver Bullet." BYTE (October, 1990).

Mori, Ryoichi and Masaji Kawahara. "Superdistribution: An Overview and the Current Status." ISEC 89-44.

Mori, Ryoichi and Masaji Kawahara. "Superdistribution: The Concept and the Architecture." The Transactions of the IEICE, vol. E73 (July, 1990).


Copyright © 1992, Dr. Dobb's Journal