Dear DDJ,
Eric Zapletal makes some good points ("Letters," DDJ, August 1994). However, as a software developer with an EE background, I am qualified and compelled to respond to his criticism of programming languages. I believe Eric is correct in his assertion that a schematic representation would be far more productive for RLU programming than traditional languages; he is incorrect, however, in implying that schematic representations are inherently superior to programming languages in general. Eric states:
A circuit schematic is 2-D, and it is understood that you can read (or look at) any part of the schematic in any order. For a language [presumably he means "program"] to make sense, you must start at the beginning and work steadily through to the end (clearly, programs don't run steadily from BEGIN to END--the main reason why languages are not suited to programming).
For starters, the comparison itself is flawed; he is comparing the process of understanding a single part of a schematic with the understanding of an entire program. Furthermore, any part of a program (or at least a well-written program) may be viewed and understood separately from the whole. That is a key principle of virtually every programming methodology--schematic- or language-based. (And where is it written that because programs do not run steadily from BEGIN to END, languages are not suited to programming? Indeed, that is quite a leap in reasoning.)
I think it true that a visual methodology for language-based programming would be more productive for some tasks than some current methods, and the evolutionary direction of certain software-development tools supports this. However, until the field of software engineering matures, I do not believe this will be entirely possible. The very notion of a simulation environment implies a well-defined number of tightly controlled parameter inputs and outputs. A physical component can be modeled as a truth table, transfer function, or appropriate metaphor on a diagram--can the same be said for a function? Certainly a great number of common algorithms have come into widespread use in the programming community, but is any significant percentage thereof truly standardized? I think not.
With regard to Eric's questions about the existence of 32- and 64-bit software, there is a paradox of inertia involved. The primary motivating force behind most product development is sales. Why should one develop software for a 2x-bit platform when the market for x-bit software is much more lucrative? It is in this manner that hardware stifles software. Paradoxically, it is the enticement of more powerful software that generally moves the installed base to upgrade. So, what really comes first, the upgrade or the software?
Concerning bugs becoming a "programming badge of merit," I share Eric's opinion that this is disgraceful. I have, however, never personally met a developer whose goal was to generate bugs or wear them as a badge of merit. I have, unfortunately, met developers who seem to share Eric's implied conviction that perfection is possible. I am sure that virtually every nontrivial piece of code I have ever written has a hidden bug somewhere, but I take no perverse pride in this; it is simply a painful acknowledgment of my flawed and fragile humanity.
In the same way that different programming languages lend themselves to different tasks, different development methodologies also lend themselves to different products. I continue to enjoy the process of learning new languages and learning to best differentiate the class of problems to which a particular language is best suited. In summation, Eric's comments remind me of the old adage about one's possession of a hammer so inclining one to (myopically) view each new task as just one more nail to pound.
John B. Williston
Plainwell, Michigan
Dear DDJ,
In his "Editorial" on the U.S. Postal Service (DDJ, June 1994), Jonathan Erickson falls into some common misconceptions about the Post Office. He states that the Postal Service is stuck between universal delivery of the mail and (often cheaper) competitors who can pick and choose where to deliver.
Jonathan should check with some of these competitors. UPS delivers to every address in the United States. They also deliver to every address in many European and east Asian nations. I think Federal Express also delivers to every address in the United States.
UPS and Federal Express do enjoy the advantage of primarily serving the business-to-business markets of express and package delivery, but this is because of Postal Service monopolies which prevent competitors from delivering many types of mail.
The reason private companies provide universal delivery even when not required to is quite simple. If a private delivery service didn't provide universal delivery, shipping a package would involve checking lists of destinations to decide who delivers where. It's simpler to go with a single shipping service--adding a little to delivery costs (to get to remote areas) increases the volume of business immensely.
This translates into the "information superhighway." Congress is being lobbied to legislate universal, subsidized access to "worthy" causes. The incentive exists to provide universal access without government mandates since anything less results in a "look up how to send the information" problem. It's one reason why the major online services (CompuServe, MCI Mail, America Online) are all connected to the Internet: It avoids a question of which service to log onto and permits me to send this letter electronically even though I don't have an account on any of the online services.
Thomas Wicklund
Longmont, Colorado
DDJ Responds: Thanks for your letter, Thomas. You're right. It would have been irresponsible for me not to check with competitors to the U.S. Postal Service to get their side of the story--that's why I called both Federal Express and UPS. According to the spokesperson I talked to, not only is FedEx barred by law from competing with the Postal Service in the home-to-home market, they have no interest in doing so.
Dear DDJ,
In his article, "IPC: UNIX vs. OS/2" (DDJ, May 1994) John Rodley did an excellent job of comparing the UnixWare and OS/2 approaches to IPC. In particular, his use of analogies made the article very clear. This issue of DDJ was timely for me, because I'm involved in a project that requires portability between UNIX, OS/2, and Windows, and relies on shared-memory IPC.
Also, I'd like to mention that I ported the example code to Linux with only a single change in one #define.
Thanks for this interesting issue. An article comparing streams frameworks on different OSes would be welcome!
Carlos Crosetti
Buenos Aires, Argentina
Dear DDJ,
I just read Reg Charney's article, "Data Attribute Notation and C++" (DDJ, August 1994) and must comment that his approach "feels" very right. As he points out, the idea of encapsulating attributes in their own class allows a close match with systems design and can ensure that the application has a consistent method of handling attributes across all classes that define that attribute (a rudimentary data dictionary). I look forward to trying this method on my own projects.
James Mitchell
Auckland, New Zealand
Those Installation Blues
Dear DDJ,
I read about Al Stevens' experience with OS/2 with significant empathy ("C Programming," DDJ, August 1994). I feel vindicated in my decision to not load OS/2 2.x on my system at all. Back when IBM had the $49.00 upgrade offer, I bought a copy. After I read the installation directions, I decided it wasn't worth the hassle, and I gave my copy away. I am also a professional programmer, and I had only curiosity to satisfy. I didn't feel that satisfying my curiosity was worth the risk of trashing my system and having to restore it all from back-ups.
Meanwhile, quite a bit of time has passed, and I have decided to push my way through the difficulties and try Windows NT and Coherent, Mark Williams' UNIX clone. I thought I had a difficult time with Windows NT, but it wasn't quite as bad as Al Stevens' experience with OS/2. I recently purchased a 2GB Seagate Barracuda hard drive and a WangDat 3200 for backup. Part of the reason for the extra disk space was to have some room to play with things like Windows NT and various programming languages and development environments.
First, I backed up everything to a DAT tape, and then began to install Windows NT. I had both a CD-ROM and a set of disks. My CD-ROM came as part of a SoundBlaster Multimedia kit, and so was not directly supported by Windows NT. Nonetheless, there were instructions for installing using a nonsupported CD-ROM. I followed those instructions. The installation program asked me where to put the Windows NT files, and since the 200-Mbyte IDE C: drive was nearly full, I specified the E: drive (second 500-Mbyte logical drive on the Barracuda). Well, the end result of this was that I lost everything on the 2-gigabyte drive. Windows NT and DOS had some kind of disagreement about which drive was the E: drive, and I had to use my backup tape. (At least that worked fine!)

So I moved lots of stuff from the C: drive to the D: drive to make room for Windows NT, backed up again, and tried again. This time everything loaded fine. I took the default 640x480 video configuration, planning to follow the directions and change to a higher resolution after installation. When I tried to install a higher-resolution driver, Windows NT asked for an installation disk, but refused it when I put it in the floppy drive. It also refused to take it from the CD-ROM, even though the CD was accessible via the driver I had downloaded from Creative Labs' BBS. I didn't want to fuss with the stack of floppy disks, so I decided to live with 640x480 for a while.

I noticed that NT was having problems with my 90-Mbyte Bernoulli drives. They would spin up and down repeatedly for 15 minutes or even longer before NT finally decided it couldn't tell what file system was installed. It booted more quickly when I turned off the Bernoullis. Meanwhile, I didn't try to do any serious work with NT.
Time passed. I replaced my Trident SVGA card with a Hercules Dynamite Pro, and my Adaptec 1522 SCSI card with an Adaptec 1542. I decided to bite the bullet and install NT from the floppies so that I could use 1024x768. I made the mistake of asking for 1024x768 during installation, and ended up having to repeat the first part of the installation process. Finally, I was able to install a 1024x768 driver, but I still had the Bernoulli problem. Now, however, I couldn't simply turn off the drives to get NT to boot; I also had to reconfigure the AHA-1542 to supply a termination.

Also, NT failed to properly migrate my Windows desktop. It completely missed a few groups, and in the groups it did get, it initially set all the icons to a question-mark icon. I found that by selecting an icon and hitting Ctrl-Enter followed by Enter, NT would then find the correct icon. However, my Microsoft Office group was nowhere to be found--in a strange sort of poetic justice, it was only a couple of groups of Microsoft applications that failed to migrate from Windows.

Eventually, I stumbled onto the solution to the Bernoulli problem. The AHA-1542 ROM setup has an option to "Send Start Unit Command." The default is disabled. When I enabled the option for the Bernoullis, NT booted with no problem. So far, I haven't been able to make NT crash, but then, I haven't tried very hard, either. I suspect that it is more bullet-proof than OS/2 per Al Stevens' experience.
After I realized that Coherent required a separate hard-drive partition, not merely a logical drive in an extended DOS partition, I decided not to install Coherent on my main system, but to use another computer, at least initially, to avoid reorganizing my 2-gigabyte hard drive. I purchased an additional 420-Mbyte Connor IDE drive for less than $250.00 and managed to get it working as the master with my old 212-Mbyte Connor IDE drive. I was then able to install Coherent, with the assistance of a couple of tech support calls. It seems that both Coherent and Windows NT are more picky about hardware than DOS because they access it directly without using the BIOS. I can no longer reboot the computer with Coherent on it by pressing reset. I have to turn off the power, and then power on again. This is apparently some kind of inadequacy in the chipset or BIOS. The Coherent tech-support person was competent and helpful, and I didn't have to wait long on hold. (I'll give you one guess why I haven't even bothered to try to get Microsoft tech support on the phone.) Meanwhile, I realized that X Windows support could barely limp along in 4 Mbytes of RAM, so I ordered some more RAM, and decided to leave Coherent alone for a while.
It is certainly not "love at first sight" with Coherent. It really does act like UNIX, with all the user-unfriendliness included. However, Coherent is an inexpensive way to learn something about UNIX. Hopefully, the educational value will be worth the trouble. I'll know more when I fiddle with the X Windows stuff and the C/C++ compilers.
Daniel E. Hale
Anaheim, California
Dear DDJ,
In Michael Swaine's interview with Lee Buck ("Programming Paradigms," DDJ, August 1994), Lee says: "Call me silly, I shouldn't have to spend a lot of time hitting tab. I just think that's stupid." I'm calling Lee silly. Doesn't he know about indent?
Indent is a BSD program which has been around for about 18 years, is currently in the GNU suite, and can reformat C code in a wide variety of formats. I never found the time I spent formatting code to be a waste. (I never have used a context-sensitive editor--it might be nice to use something where I would hit a key and get a new function to fill in.)
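For anyone who hasn't met it, a typical invocation looks like this (the file names are hypothetical; -kr and -i4 are standard GNU indent flags for K&R style and a four-space indent):

```shell
# Reformat messy.c into K&R brace style with 4-space indents,
# writing the result to tidy.c.
indent -kr -i4 messy.c -o tidy.c
```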
Marty Leisner
leisner@sdsp.mc.xerox.com
Copyright © 1994, Dr. Dobb's Journal