In case you thought there was a lull in the long-standing OS wars, think again. The ongoing conflict between the computer industry's major players has simply moved over to a different front. The new theater of operations is that of interapplication object models, loosely known as "component objects."
This, as you may know, is a software technology that fosters distributed computing with objects--allowing the user of an application program to incorporate and edit document components created by other applications, some of which may reside on remote machines. This approach, which combines system-level software with application-level protocols, takes the classic object-computing paradigm and moves it away from the realm of a single programming language or application, into the wide-open spaces of true interoperability. But at the moment, much of what you'll find in these wide-open spaces is vaporware and hot air. There are many potential contenders and announcements, and precious few tangible landmarks.
In sorting out the various contestants, be prepared to wade through a numbing cloud of acronyms. Here's the quick summary (take a deep breath, or perhaps just skip over this paragraph). Microsoft has OLE (Object Linking and Embedding), which is based on its Component Object Model. In conjunction with DEC, Microsoft is integrating its object model with the DCE/RPC protocol to produce COM (Common Object Model). OLE will thus interoperate with DEC's ObjectBroker, based on the CORBA (Common Object Request Broker Architecture) specification from OMG (Object Management Group). Also based on CORBA are IBM's SOM (System Object Model) and DSOM (Distributed SOM). IBM is also backing a more ambitious object-computing effort by Taligent, which was initiated by Apple. HP recently bought an interest in Taligent and is also working with NeXT on PDO (Portable Distributed Objects). NeXT has announced it will work more closely with Sun in further developing the NEXTSTEP object model and promoting the OpenStep API. Sun is the developer of DOE (Distributed Objects Environment), part of Project DOE (Distributed Objects Everywhere), which includes DOMF (Distributed Object Management Facility). DOE is compliant with CORBA, as is Apple's OpenDoc compound-document technology, which Apple claims is supported by WordPerfect, Novell, Borland, and IBM. Apple also has a semirelated technology called the OSA (Open Scripting Architecture) Object Model. Got all that?
Because almost everyone is talking in the future tense, the prospects for interoperability seem rosy indeed. Basically, everything is either based on, compliant with, or slated to interoperate with everything else--transparently, of course. But present-day reality is more opaque. Take OMG, for example. It was founded five years ago, in its words, "to create a standard that realizes interoperability between independently developed applications across heterogeneous networks of computers." OMG membership includes over "280 of the highest-octane minds from every single respectable software and computer company in the world" (according to an OMG advertisement), paying annual dues of between $5,000 and $50,000. On the OMG board of directors are representatives from DEC, HP, IBM, NCR, Lotus, ICL, Siemens, Unisys, and Borland. OMG organizes three annual conferences (the Object World series) and publishes a glossy bimonthly magazine, First Class. OMG's CORBA 1.1 specification was released in 1990 and has found its way into several implementations.
Yet, although the CORBA 1.1 spec provides the theoretical basis for interoperability, actual interoperability won't be possible until the release of the 2.0 specification, which is still being written. OMG Director Jon Siegel explains: "Why didn't CORBA 1.1 standardize interoperability? [Because] OMG requires that technology be technically viable and commercially available before it can be proposed for standardization" (First Class, 1/94).
Microsoft is seeking to take advantage of the disarray among this (mostly) UNIX crowd with its OLE2 technology. At the recent Software Development '94 conference in San Jose, Jim Allchin, vice president of advanced systems at Microsoft, emphasized that OLE2 "provides real value today." He also said OLE is "the most fundamental technology that is going to move the industry forward since Windows." Allchin added "the software component revolution starts today, officially," and reiterated that "OLE is the most significant thing since Windows."
Allchin demonstrated OLE objects interoperating between Windows 3.1 and Windows NT, and also showed the beta of OLE on the Macintosh. He attempted to show OLE on Windows 3.1 interoperating with a DEC machine running X, but--ironically--the software would not cooperate, so nothing happened. Allchin also announced the new OLE custom control (OCX) extension to the OLE architecture that provides the benefits of Visual Basic custom control (VBX) within the context of OLE technology. Development of OLE custom controls will be facilitated by an add-on to Microsoft's Visual C++ 1.5 package, which extends Microsoft's MFC framework to support OCXs and provides additional tools like a Control Wizard and an OLE Test Container.
Microsoft took the opportunity to tout the advantages of OLE over IBM's SOM/DSOM approach. In a printed handout, Microsoft says the IBM model is "an incomplete object solution." By themselves, SOM and DSOM do not support compound documents (this happens at higher layers of the software architecture). SOM does not support distributed computing; DSOM does support it, but requires source code and binary changes to SOM objects. Further, the "SOM/DSOM model allows objects to inherit source-code implementations from other objects through uncontrolled class hierarchies." Although this kind of inheritance can facilitate software development of objects, "it does so while sacrificing system robustness." Citing another shortcoming, "(SOM and DSOM) identify objects with simple names, not globally unique identifiers [which] may lead to naming conflicts in systems with many objects, and any system with objects supplied by different vendors." Further, they "lack a logical thread model to prevent object deadlocks, lack a security model, [and] lack robust object versioning control."
By contrast, in OLE, "an object can be shared by multiple applications at the same time across different address spaces." OLE's Component Object Model "defines a robust, binary standard that is type-safe." Objects that conform to the OLE model are "guaranteed to interoperate" and are "completely interchangeable and reusable without any recompilation." Naming conflicts, even in systems consisting of millions of objects, are avoided via GUIDs (globally unique identifiers). Further, OLE provides "safe support for object upgrades (versioning)" and "protects against object deadlocks" via logical thread IDs that handle threaded and nonthreaded operating systems in a consistent manner.
Say what you will about Microsoft, its approach to the debate on object computing cannot be faulted. While other contenders hand out glossy brochures printed on heavy stock with a lot of white space and few details, Microsoft's handouts look like they were printed at the local corner copy place. Nevertheless, they are long on content and focus on key technical issues.
In addition, OLE seems to be more real than its competitors. Microsoft claims that 1.5 million application packages supporting OLE have shipped, along with 8,000 OLE developer toolkits and 25,000 copies of Kraig Brockschmidt's book Inside OLE 2 (currently shipping at a rate of 1,000 per week), published by Microsoft Press.
Even so, OLE is not without problems. Jesse Berst, editor of Windows Watcher, summarizes OLE's shortcomings: "It is clumsy, cumbersome, complex, and incomplete." Paul DiLascia, writing in Microsoft Systems Journal, calls it "a programmer's nightmare--an intimidating mess. No one knows the number of lines required to write an OLE 2.0 app, but I'm sure it's measured in the thousands." Kraig Brockschmidt, in his book, writes: "OLE is big. Very big. If you count the number of new functions in OLE 2, you have more than in Windows 3.0 itself."
It appears that people with a firm grasp of OLE are few in number, even at Microsoft. I asked nine people from Microsoft to answer some basic questions about OLE custom controls and could not get answers at the press conference, at the booth, or in a private meeting. Finally, one senior technical manager provided educated guesses to the following questions: How many additional interfaces were added to OLE's 62 interfaces in order to support OLE controls? (Not many, about five or six.) How many lines of code did it take to implement OLE control support in MFC, in addition to the 20,000 lines that provide basic support in MFC Version 2.5? (Probably fewer than 10,000.) Are OCX events implemented via the IAdviseSink mechanism? (Yes.)
As an object-computing technology, OLE has some odd quirks. For example, you can never get a pointer to a whole object, only to a subset of its functionality (known as an "interface"). An object consists of a number of interfaces, but an object's client can never know exactly which ones; it can only ask whether a particular interface is available. Another quirk: rather than being deleted explicitly by its client, an object maintains its own internal reference count and destroys itself when that count reaches zero. It is the programmer's responsibility, however, to increment the reference count each time a pointer to an interface is copied (assigned to a variable). Finally, it turns out there is no true inheritance of behavior, only two related mechanisms known as "aggregation" and "delegation."
According to Brockschmidt, "OLE2 is the first step in the evolution of Windows from the function-call-based operating system we have today to an object-oriented operating system in the future." At the moment, it looks like the future object protocols will feel a lot like the present-day APIs: workable, but large, inconsistent, and subject to change.