Dear DDJ,
I was pleased to see the sidebar "Design by Contract" in the article "Implementing Assertions for Java," by Jeffrey Payne et al. (DDJ, January 1998). Design by Contract is an important -- yet long-overlooked -- technique with particular relevance in the development of mission-critical software.
What wasn't mentioned is that the concept dates back to 1985. That is when Bertrand Meyer, who invented the concept and coined the term "Design by Contract," introduced it as the basis for the Eiffel programming language. He also founded a software company -- Interactive Software Engineering Inc. -- around the Eiffel language, which adherents contend is more complete than Java, more robust, more versatile, and easier to code.
DDJ readers who want to learn more about this would do well to consult any of Bertrand's published references on the subject. In particular, two chapters in his most recent book, Object-Oriented Software Construction, Second Edition (Prentice Hall, 1997), devote more than 100 pages exclusively to Design by Contract.
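For readers unfamiliar with the mechanics, a contract attaches preconditions (the client's obligations) and postconditions (the supplier's guarantees) to each routine. Eiffel enforces contracts in the language itself; plain Java must simulate them with explicit checks, which is roughly what the Payne et al. article addresses. The following is a minimal sketch only -- the class and helper names are illustrative assumptions, not taken from the article or from Eiffel:

```java
// Design by Contract sketched with explicit runtime checks.
// A violated precondition signals a bug in the caller; a violated
// postcondition signals a bug in the class itself.
class Account {
    private long balanceCents;

    Account(long openingCents) {
        require(openingCents >= 0, "opening balance must be non-negative");
        balanceCents = openingCents;
    }

    // Contract: the client must request a positive amount no larger than
    // the balance; the supplier guarantees the balance shrinks by exactly
    // that amount.
    void withdraw(long amountCents) {
        require(amountCents > 0, "amount must be positive");          // precondition
        require(amountCents <= balanceCents, "insufficient funds");   // precondition
        long before = balanceCents;
        balanceCents -= amountCents;
        ensure(balanceCents == before - amountCents, "balance updated"); // postcondition
    }

    long balance() { return balanceCents; }

    private static void require(boolean cond, String msg) {
        if (!cond) throw new IllegalArgumentException("precondition violated: " + msg);
    }

    private static void ensure(boolean cond, String msg) {
        if (!cond) throw new IllegalStateException("postcondition violated: " + msg);
    }
}
```

That division of blame -- caller versus supplier -- is the heart of the technique.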
In addition, Meyer offers a wealth of related information and technical papers to interested software developers and others at http://www.eiffel.com/.
Alexander Prigozhin
Alexander.Prigozhin@eiffel.com
Dear DDJ,
Al Stevens' recent discussion of why Standards take so long got me thinking. C finally got standardized just as C++ was taking off. C++ finally got standardized as Java is eating its lunch. Is there a trend here? Can anyone doubt that Perl will quickly be standardized shortly after Larry Wall starts programming in something else?
I postulate a group of nomadic language designers, slaving on the front lines trying to get new features into the current Standards effort, and, in the process, confusing and delaying the Standards process. When the next hot language comes along, this nomadic crew decamps, and the ordinary joes on the Standards committee quickly wrap up the design.
How would it be if a Standards committee agreed that any feature that could not be clearly understood and unambiguously described in six months should be summarily thrown out of the language? Clearly, anything that obscure would be a source of bugs, no matter which way the decision went.
I do see some light, however. As our development environments get more visual, syntax is becoming increasingly irrelevant. "File scope" and "header files" are nearly technologically obsolete in visual environments. As machines get faster, the ability to do semantic checking and transformation will also improve. Why shouldn't I write a sort routine using whatever syntax I feel most comfortable with, and use this to realize implementations in C, C++, Java, and so on?
I remember programming in 32 KB with amused amazement -- perhaps in 20 years we will take the same attitude toward these syntactic and semantic wrangles.
Steve Johnson
scj@transmeta.com
Dear DDJ,
I enjoyed Carol Jones' article, "The Java Internationalization API" (DDJ, January 1998). While the Java I18N API is nice for European languages and for East Asia, it is deficient with respect to right-to-left languages such as Hebrew and Arabic.
To provide proper support, each visible item (object, window, message) that may include text or other items must have a "base direction" property, either right-to-left or left-to-right. This property determines the layout of the text and possibly other UI features.
Visual Basic 5 calls this property "RightToLeft" and supports it in most of its controls. Java should do the same.
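A minimal sketch of what such a property might look like in Java follows; the type names, the inheritance policy, and the alignment rule are all hypothetical assumptions for illustration, not an existing API:

```java
// Hypothetical "base direction" property for visible UI items.
// Every item that may contain text or other items carries one,
// and layout decisions consult it.
enum BaseDirection { LEFT_TO_RIGHT, RIGHT_TO_LEFT }

class VisibleItem {
    private BaseDirection direction = BaseDirection.LEFT_TO_RIGHT;
    private final java.util.List<VisibleItem> children = new java.util.ArrayList<>();

    // An assumed policy: contained items inherit the container's direction.
    void setBaseDirection(BaseDirection d) {
        direction = d;
        for (VisibleItem child : children) child.setBaseDirection(d);
    }

    void add(VisibleItem child) {
        children.add(child);
        child.setBaseDirection(direction);
    }

    BaseDirection baseDirection() { return direction; }

    // Text alignment and other layout features key off the property.
    String defaultTextAlignment() {
        return direction == BaseDirection.RIGHT_TO_LEFT ? "right" : "left";
    }
}
```

The point of the sketch is that the property must live on every containing item, not just on text fields, so that a whole window can be mirrored at once.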
Jonathan Rosenne
http://ourworld.compuserve.com/homepages/Jonathan_Rosenne/
Dear DDJ,
Kenneth Price's letter (DDJ, February 1998) showed that we have common interests: using Ada for everything, including missiles. I even use it as a hardware description language (HDL).
We probably have drastically different views of Ada-95, though. Ken said, "Ada came of age many years ago." It is now as good as dead; even the DoD abandoned it under the guise of COTS. The reason is adequately stated by Blake McBride in his response to the letter following yours. He said toward the end:
C++ becomes the Ada of the '90s. C++ has taken a small straightforward language (C), which has a relatively small learning curve, and created a language that has so much complexity and syntactical nuance that it takes years to learn and master.
I would almost bet a nickel that neither of you has read and absorbed the impact of the feature restrictions in Annex H (Safety and Security) of the Ada-95 LRM. What remains after all the restrictions are applied has demonstrated adequacy for any embedded application, and even for use as an HDL. It is therefore the hardware/software codesign language (CDL) that many only talk about.
It is almost impossible to find a forum that caters to readerships that design computer languages, software, and hardware (chips). Very few people know that in 1980, the VHSIC conference at Woods Hole concluded that Ada was inadequate for use as an HDL -- a faulty premise that launched the bastardized Ada called "VHDL." There was no Ada compiler in 1980 for anybody to claim experience [with] Ada. Neither the extensions nor much of the Ada feature set included in VHDL is necessary for use as an HDL.
Most of the features restricted by Annex H may be called "dynamic constructs" that imply the existence of a time-sharing operating system. This is the result of decades of computer-science education propagating such premises without examining why. At a time when a basic processor with program and scratch-pad memory can be put on a chip for around a dollar, building micro-time-sharing systems seems illogical. The point lost on language designers is that somebody has to do the low-level dirty work, as with a Java engine. The result is today's dependence on the bloated Windows; the user can no longer control his own PC.
There is no open forum to debate whether functionality should be included in languages or provided as standardized application-specific packages that augment a small and simple core language. Maybe you, and especially Blake, who pointed out the sins of language complexity, can call on DDJ readers to create a grass-roots working group on the Internet for Ada as totally restricted by Annex H. We can then freely examine the rationale, or rather the lack thereof, behind many computer-science myths that were handed down from generation to generation and became dogmas. Examples are "a picture is worth a thousand words" and "concurrency (on a single computer?)."
Sy Wong
sywong@hermix.markv.com
Dear DDJ,
I just read Gerald Graef's review of "Graphical Applications with Tcl and Tk" (DDJ, February 1998) and would like to take the opportunity to correct an error in it. The statement that Tcl is the only language (save Java) that allows you to "create programs that can be independent of both graphical hardware and operating systems" is plainly false, as our product MetaCard has supported this type of cross-platform development for many years.
The MetaTalk language used in MetaCard is a superset of the HyperTalk language used in HyperCard, and as such, is much easier to learn and use than Tcl. Execution speed is also much faster, and you don't have to give up features like associative arrays and regular expressions as MetaCard has these built in, too. Also unlike Tcl, MetaCard comes with a complete graphical development environment (Tcl GUIs are created by writing scripts).
MetaCard is available for Windows 95/NT and all popular UNIX systems, and a version for the Mac is in public alpha-test and is scheduled to be released later this year. You can get more information about MetaCard and download the free Starter Kit version from http://www.metacard.com/.
Scott Raney
raney@metacard.com
Dear DDJ,
As much as I'm in favor of articles that cover mathematical themes, "Symbolic Integration using CLIPS," by John Swartz (DDJ, June 1997) gave the wrong impression about the state of the art in symbolic integration. Integration by heuristics was state-of-the-art in the early '60s, but there is a better mousetrap called the "Risch algorithm." In a famous paper, Risch outlined an algorithm that solves the problem of symbolic integration: if the integral is expressible in elementary functions, the algorithm outputs it; otherwise, it reports that the problem provably has no solution. Since then, there has been a lot of refinement, and extension to algebraic functions (the transcendental case proved to be easier), and there is now plenty of literature on the subject in any good library. There is simply no reason to do integration with heuristics. Commercial packages such as Macsyma, Reduce, Axiom, and Mathematica all use a variant of the Risch algorithm.
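The decide-or-refute behavior described above can be illustrated with a classic textbook case (standard material from the literature, not from the article under discussion). For integrals of the form $\int f(x)\,e^{g(x)}\,dx$ with rational $f$ and $g$, Liouville's criterion reduces the question to a differential equation over the rational functions:

```latex
\int f(x)\,e^{g(x)}\,dx \text{ is elementary}
\iff
\exists\, R(x) \text{ rational with } R' + g'R = f.
```

For $\int e^{x^2}\,dx$ (so $f = 1$, $g = x^2$) this demands $R' + 2xR = 1$. No rational $R$ works: poles of $R$ cannot cancel, so $R$ would have to be a polynomial, but then $R' + 2xR$ has degree one more than $R$ and can never equal the constant $1$. A Risch-style integrator therefore answers "provably not elementary," where a heuristic program simply gives up without knowing why.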
Andreas Eder
Munich, Germany
Andreas.Eder@mch.sni
Dear DDJ,
At the end of his article "Symbolic Integration using CLIPS" (DDJ, June 1997), John Swartz makes a comment that is misleading -- he assumes that there will always be a case that symbolic integration programs cannot handle. While this is true of heuristic programs, such as the one he implemented, it is not true of algorithmic approaches. Yes, there is an algorithm to determine any antiderivative that is expressible in terms of the usual operations and functions (addition, multiplication, exponentiation, sine, cosine, log, and the like) -- and if there is no such antiderivative, the algorithm says so. The problem is a couple of centuries old; it was solved only relatively recently, by Risch in 1970. Currently, researchers are working on what happens when you allow more than the usual functions, either under the integral sign or in the answer. Symbolic Integration I: Transcendental Functions (Algorithms and Computation in Mathematics, Vol. 1), by Manuel Bronstein (Springer Verlag, 1996, ISBN 3540605215), addresses this subject. (For a more accessible text that mentions these results, see "Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp," by Peter Norvig.)
Arthur Nunes
arthur@ccs.neu.edu
DDJ