Dear DDJ,
In "Self-Registering Objects in C++" (DDJ, August 1998), Jim Beveridge noted a couple of important restrictions and one linker/loader dependency. We tried out the proxy class and specialty-store code and it did not work for us, because our linker/loader was not setting the contents of our global variables to zero.
Ross Pettinger and I came up with what we think is a very good solution. It occurred to me that the only way a variable can be relied upon to have a guaranteed value while global constructors are running is for it to be defined statically within a function, with an initial value. Using this knowledge, we keep the information stored by the specialty store (we used a pointer to a linked list) as static variables within the Register() function, initialized to a sane value (a NULL pointer in our case), and have Register() update a global variable that is used at run time to find the registered classes. This technique proves portable and removes the dependency on the linker/loader.
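A minimal sketch of the technique the letter describes might look like the following. The names (Node, Register(), MakeWidget(), and so on) are illustrative, not the authors' actual code; the key point is that the registry's head lives as a function-local static, which is guaranteed to be initialized before any global constructor uses it, regardless of link order:

```cpp
#include <cstring>

// One entry per registered class.
struct Node {
    const char *name;
    void *(*create)();
    Node *next;
};

// The list head is a function-local static, so it is guaranteed to hold
// its initial value before any global constructor calls Register() --
// no dependency on the linker/loader zeroing globals.
Node **Registry() {
    static Node *head = nullptr;  // sane initial value
    return &head;
}

void Register(Node *n) {
    n->next = *Registry();
    *Registry() = n;
}

// A proxy object whose constructor registers a class during startup.
static int widgetInstance;                       // placeholder "product"
void *MakeWidget() { return &widgetInstance; }   // placeholder factory
static Node widgetNode = { "Widget", MakeWidget, nullptr };
static struct Proxy { Proxy() { Register(&widgetNode); } } widgetProxy;

// At run time, look up registered classes by name.
void *CreateByName(const char *name) {
    for (Node *n = *Registry(); n; n = n->next)
        if (std::strcmp(n->name, name) == 0)
            return n->create();
    return nullptr;
}
```

Because the static inside Registry() is initialized by the language, not by the loader, the scheme works even where global variables are not reliably zeroed before construction.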
Thank you for providing an excellent publication devoted to expanding the knowledge of software engineers. We are always talking about your articles when the issue arrives in the mail.
Aki Morita
amorita@ghg.net
Dear DDJ,
Regarding the point about bool Al Stevens made in his column on Standard C++ (DDJ, September 1998) -- it gets worse. I've found Win32 API functions that return BOOL but actually use it as int and return values other than 1 or 0. Sometimes those values even indicate different types of errors. Don't expect Microsoft to change this stuff, though. Win32 is primarily a C API and I don't see Microsoft going off and rewriting it. I find that it's better to bury any Windows code down below and write everything else as generically as possible. A better feature for Microsoft to implement would be namespaces so that all its stuff is in a Win32 namespace or something. So many preprocessor macros are used that the namespace is totally polluted whenever you include windows.h. Alas, the preprocessor is immune to namespaces.
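The pitfall the letter describes can be shown in a few lines. This is a simulation, not real Win32 code: BOOL is typedef'd locally and the API call is a stand-in that returns 2, as some actual Win32 functions reportedly do:

```cpp
// Illustrative only: in the Win32 headers, BOOL is just an int, so an API
// may legally return any nonzero value to mean "true".
typedef int BOOL;
#define TRUE 1

// Stand-in for a Win32 call that signals success with a nonzero value
// other than 1.
BOOL SomeApiCall() { return 2; }

bool Broken()  { return SomeApiCall() == TRUE; }  // wrongly reports failure
bool Correct() { return SomeApiCall() != 0; }     // compare against zero instead
```

Testing `== TRUE` silently breaks on such functions; testing `!= 0` (or just using the value in a boolean context) does not.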
John Burkhardt
Johnb@gamefx.com
Dear DDJ,
Tim Pfeiffer is correct in his "Online Op-Ed" that DLLs are abused and hence cause problems, but it's not DLLs that are the problem -- it's the people who install/architect them.
The ability to upgrade DLLs instead of the whole enchilada is huge in certain applications due to the decreased size of upgrades. An example is an app I wrote that ran on mobile computers connected via wireless (read: slow) networks. Sending out upgrades to 100 mobile computers overnight would never complete if we had to ship the entire app every time.
Also, RAM efficiency is still a real issue to be dealt with on a lot of older/underpowered systems out there, and hence certain DLLs (like MFC, which everyone, including Microsoft, uses) would cause a lot of unnecessary page faults if they were statically linked.
There is no question that the problems of orphaned DLLs and shared DLLs that break apps when upgraded are real, but much of this can be mitigated by better install/uninstall programs and more judicious placement of shared DLLs within the directory structure of your application (so that your apps are the only ones sharing them). And while I am still holding out on the promise of COM, there is no question that this technology solves the shared DLL problem quite nicely.
You're right, Tim, DLLs are abused in Windows. But you'd do the subject more justice by taking a less harsh stance on something that does have some merit.
Steve Kelleher
stevek@spectralogic.com
Dear DDJ,
Tim Pfeiffer's point in his "Online Op-Ed" entitled "Windows DLLs: Threat or Menace" at http://www.ddj.com/ is well taken, and I quite agree. I am a programmer and also perform LAN/software support. Windows 95 is the most unstable OS I have ever worked with. This is no isolated condition; other support personnel relate the same opinions. Considering that it can perform quite well in a stable environment, with no software additions, modifications, or updates, I have to conclude that DLLs may well be a menace to a stable configuration. Like DLLs, the dynamic nature of Windows 95 or (shudder!) Windows 98 may have been well intentioned, but not necessarily very well thought out. Imagine the conditions that would result from programming with Units that might change or be modified by third parties, without notice. Add to Tim's arguments the fact that substantial error handling must be included to detect incorrect or missing DLLs, and I must agree: Programs would be much more stable and reliable without the use of the ubiquitous DLLs.
Mike Kelley
mjkelley@advaudiodev.com
Dear DDJ,
Upon reading the article "Date Compression and Year 2000 Challenges," by Robert L. Moore and D. Gregory Foley (DDJ, May 1998), I unfortunately saw no mention of a solution called "encapsulation," which can work even on embedded processors.
Encapsulation involves setting the date that the processor or app is exposed to back to a date before 2000 that it can handle. Incoming data is set back date-wise via a filter. Outgoing data dates are set forward via another filter. This keeps the non-Y2K app happy while it continues to function.
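A minimal sketch of the two filters described above follows. The 28-year offset is my assumption, not the letter's -- it is a common choice for this technique, since within 1900-2099 the Gregorian calendar repeats its weekday and leap-year pattern every 28 years, so shifted dates stay internally consistent:

```cpp
// Assumed offset: 28 years preserves day-of-week and leap-year structure
// for dates between 1900 and 2099.
const int kOffsetYears = 28;

struct Date { int year, month, day; };

// Incoming filter: set the date back so the non-Y2K app never sees
// a year it cannot handle.
Date ToLegacy(Date d)   { d.year -= kOffsetYears; return d; }

// Outgoing filter: restore the true date before data leaves the system.
Date FromLegacy(Date d) { d.year += kOffsetYears; return d; }
```

The non-Y2K app runs entirely on the shifted dates; only data crossing the boundary passes through a filter.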
Consider this: How would you get a life-support system in a hospital to continue to function if it contained a non-Y2K-compliant processor that made it simply lock up? The answer is simple: Shut off the machine, pull any battery backup, put the battery back in, turn the machine on, and initialize it to a pre-2000 date (just as you might if it came right out of the box). The machine will have incorrect date stamps, but it will continue to support life. This is what encapsulation is. The only alternative in this case is to replace the unit, especially if the non-Y2K-compliant programming is on two soldered-in ROMs. For more information on encapsulation, see http://www.2000technologies.com and http://www.jks.co.uk/y2ki/.
Jeff Feeley
jfeeley@zenith.att.com
Dear DDJ,
I was disappointed to read Peter Roth's review of my book, The Design and Development of Fuzzy Logic Controllers, in the May 1998 issue of DDJ. I feel that the coverage of my book was simplistic and harsh. Although there is an editorial error in the graphs on pages 2 and 3, which has been corrected in a subsequent printing, I feel it does not detract from the overall content of the book.
Readers who take the time to go beyond a cursory inspection of the book would understand that what Mr. Roth calls an "unintelligible 3D graph" (page 36) is used as an aid to interpreting the FuzzyStat system's behavior. The graph is explained previously on page 33, using terminology defined earlier in the book ("Control Surface Plots," page 7). Similarly, the "several unintelligible input files" referred to in the review are used by the TeachFuzz fuzzy logic simulator, which is discussed on page 24.
Further, what Mr. Roth refers to as "opaque graphs" are actually the accompanying control surfaces for each of the simulated models. Keep in mind, each fuzzy logic simulation tool has its own unique input file format and the TeachFuzz fuzzy logic simulator is no different. For example, Motorola's Fuzzy Inference Development Environment (FIDE) has a similar structure, requiring multiple input files.
Additionally, I would disagree with Mr. Roth's criticism of the fuzzy logic development cycle (Figure 3.2, page 45). Although the explanation of this diagram is abbreviated for the reader, it is truly a spiral model, not a waterfall model as he states. The difference is that a traditional software waterfall model feeds back only to the preceding phase, rather than feeding back on itself or all the way back to the requirements phase. I would refer Mr. Roth to the book Software Engineering by Stephen Schach, as cited in the book's references, for a more detailed explanation of the software development lifecycle model.
Mr. Roth does point to several strengths of the content of the book, including the number of interesting fuzzy logic controller projects presented in the book and the "clear survey" of fuzzy logic. Given these comments, his closing statements about the book are puzzling at best, and highly inaccurate at worst. I encourage readers to evaluate the book for themselves.
Byron Miller
impub@isd.net
Dear DDJ,
In "Implementing Assertions for Java" (DDJ, January 1998), Jeffery Payne, Michael Schatz, and Matthew Schmid describe a means of implementing assertions in Java. It is commendable that through their article, they help raise awareness of an important means for building reliable software, known as "Design By Contract." However, in an unfortunate oversight, they failed to attribute the phrase to its author -- Bertrand Meyer. In numerous articles and books (notably Object-Oriented Software Construction), Meyer convincingly demonstrates the benefits that software contracts bring to specification, design and testing.
In the same article, the authors also listed desirable assertion capabilities, and, since Java offers no support, described how they had created tools to provide them. Again, they failed to mention that a language already exists with such capabilities built in, namely Eiffel, designed by Bertrand Meyer. DDJ readers concerned about the software quality issues addressed by the article may be interested in knowing that Eiffel is a reliable software technology that has been around for more than 10 years. Since Eiffel's assertion language is part of its international standard, developers wishing to apply the benefits of "Design By Contract" need neither wait for changes to a proprietary language standard nor purchase third-party tools in the interim. Instead, they can use Eiffel today.
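For readers unfamiliar with the idea, the contract discipline the letter refers to can be roughly approximated even in languages without built-in support. The following is a hypothetical C++ sketch, not Eiffel and not the article's tools: Eiffel's require/ensure clauses are checked by the language itself, whereas here plain assert() stands in for them:

```cpp
#include <cassert>
#include <vector>

// Rough Design-by-Contract discipline with assertions: the caller must
// satisfy the precondition ("require"); the routine guarantees the
// postcondition ("ensure").
int Pop(std::vector<int> &s) {
    assert(!s.empty());            // require: stack is not empty
    std::size_t oldSize = s.size();
    int top = s.back();
    s.pop_back();
    assert(s.size() == oldSize - 1);  // ensure: exactly one element removed
    return top;
}
```

In Eiffel the compiler enforces and documents these clauses as part of the class interface; the assert() version must be maintained by hand and disappears when assertions are compiled out.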
Ted Velkoff
velkoff@erols.com
DDJ