Letters

Dr. Dobb's Journal February 2001

Analyzing Algorithms

Dear DDJ,

In the January 2000 "Letters," Todd Stephan claims that you can leave out the sqrt when solving the traveling salesman problem. Unfortunately, this isn't true. Here's why. In effect, by saying the sqrt is redundant, you are suggesting that a^2+b^2 > c^2+d^2 => a+b > c+d (that is, if the sum of the squared distances is greater, then so is the sum of the distances). But consider a=1, b=2.7, c=d=2. Then a^2+b^2 = 8.29 and c^2+d^2 = 8, and 8.29 > 8. So far, so good. But 1+2.7 = 3.7 < 4 = 2+2. Oops!
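The counterexample is easy to check mechanically; here is a minimal Python sketch of it:

```python
# Counterexample: ordering routes by sums of squared leg lengths
# does not agree with ordering them by sums of leg lengths.
a, b = 1.0, 2.7   # legs of route 1
c, d = 2.0, 2.0   # legs of route 2

sq1, sq2 = a**2 + b**2, c**2 + d**2   # 8.29 vs 8.0
s1, s2 = a + b, c + d                 # 3.7 vs 4.0

assert sq1 > sq2   # route 1 looks longer without the sqrt...
assert s1 < s2     # ...but is actually shorter
```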

Ben Laurie

ben@algroup.co.uk

Child's Play

Dear DDJ,

In regard to Michael Swaine's "Programming Paradigms" column entitled "Childhood's End" (DDJ, November 2000), I agree with the idea that the children of tomorrow will adapt to an entirely new world through sophisticated interactive toys, leaving us nonchildren far behind. This is somewhat evident, even today. The sophisticated CAD software of the previous decade has become the "Crayola 3D Castle Creator" that my preschool daughters fool around with on the computer (actually, I play with it more than they do — perhaps there is hope for nonchildren after all). But no matter how intricate or lifelike the toys of tomorrow may be, I cannot help but believe that the children of the future would still find more interesting things to do with the cardboard box the toys are delivered in. Some things will never change.

James Metzger

j.r.metzger@ieee.org

Hurd, Hurd, Hurd — Hurd Is the Word

Dear DDJ,

Jerry Epplin's excellent article on "The HURD" (DDJ, December 2000) seems to be missing one point that I think is important. He mentions that there are a number of "toy" microkernel-based operating systems that have been released over the years, but he doesn't mention the other large-scale microkernel OS projects, which are more similar to the HURD. Most Mach-based OSs included a complete BSD engine, and in that respect, the HURD and these other OSs are similar in that they all run UNIX. The difference is an important one though.

In the past, the kernel was seen as the only interesting area for research, likely as a side effect of it being a highly academic effort (at least to start). Although the goal was to help OS development, the benefits a microkernel offered to OS developers were unrealized unless the OS was modularized. But that was largely seen as work for someone else to do, and in the end, no one ever seemed to get around to it.

UNIX was ported with just enough changes to get it running, because it was being included largely for its utility value in supporting kernel development. Over a decade later, UNIX on Mach remains largely as monolithic as the original UNIX that spawned it, perhaps even more so due to a lack of maintenance. To illustrate this point, consider MkLinux, which was a port of Linux in essentially unchanged form so it would run on the Mach microkernel. There's no advantage to this version, and it runs slower too.

The reason I find the HURD compelling is that it's the first concerted effort to close the loop and actually fix the OS itself. In the HURD, the kernel is of secondary importance; the modularization of the OS itself is the key reason the project exists. It's only with the HURD that the advantages of the microkernel Jerry outlines in his article will become fully realized.

Maury Markowitz

maury@fintech.com

Software in the 21st Century

Dear DDJ,

In regards to Eugene Kim's article "The Future of Programming" (Dr. Dobb's Special Report on Software in the 21st Century, December 2000): The problem is not that programmers on [the] whole are lazy, impatient, and arrogant; it is that there are increasingly more mediocre programmers. To imply that all programmers fit into this derogatory species is both insulting and erroneous. Conclusions based on erroneous assumptions are themselves erroneous.

Rather than relying exclusively on Larry Wall's view of programmers, readers may want to consult a leading expert in the field, Richard W. Hamming, in particular his chapter "How To Think About Trends" in Beyond Calculation: The Next Fifty Years of Computing (ISBN 0-387-94932-1).

Anybody can program a computer. With today's visual programming, rapid application development, wizards, and the numerous scripting languages (like Perl), more and more people are becoming mediocre programmers. Just because you can program a computer does not mean you should do it for a living.

While [it] is true that "tools and good practices will make programming easier," Hamming suggests, "until we have languages that help us think about the original problem and its proposed algorithms, there will be only slow improvements in programming effort."

Edward Harned

ed@coopsoft.com

Analyzing Analytic Computing

Dear DDJ,

Laurent Bernardin's article on "Analytic Computing" (DDJ, September 2000) gives an interesting overview of the synthesis of numerical and symbolic computation using Maple.

Regarding his first example, I recently became aware of a different recursive computation for terms in the Fibonacci sequence. (This was shared by a colleague, so I don't have a reference for the description below. I did locate a different derivation of essentially the same result — where else but in Donald Knuth's Art of Computer Programming Vol. 1).

The recursion is defined by:

F(0) = 0;
F(1) = 1;
F(2) = 1;

F(n) = F((n+1)/2)^2 + F((n-1)/2)^2,  if n is odd;
F(n) = F(n/2+1)^2 - F(n/2-1)^2,      if n is even.

This recursion has depth proportional to log_2(n) rather than n. Whereas the iterative computation of F(100000) takes almost exactly 45 seconds on a 650-MHz Pentium-III (just as Bernardin predicts), the recursive computation above takes about 15 seconds, even without "options remember." If options remember is used, F(100000) is computed almost instantaneously.
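The recursion drops straight into code. Here is a minimal Python sketch, with `functools.lru_cache` playing the role of Maple's "options remember":

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # memoization, like Maple's "options remember"
def fib(n):
    # Base cases from the letter's definition.
    if n == 0:
        return 0
    if n <= 2:
        return 1
    if n % 2 == 1:                            # n odd
        k = (n + 1) // 2
        return fib(k)**2 + fib(k - 1)**2      # F((n+1)/2)^2 + F((n-1)/2)^2
    else:                                     # n even
        k = n // 2
        return fib(k + 1)**2 - fib(k - 1)**2  # F(n/2+1)^2 - F(n/2-1)^2

assert fib(10) == 55
```

With the cache in place, each halving step is evaluated only once, so F(100000) involves only on the order of log_2(n) distinct subproblems rather than n of them.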

To see why this works, you can first derive that the nth Fibonacci term (suitably reindexed) counts the number of binary strings of length n with no consecutive ones. First observe that there is one such string of length zero (the empty string has no consecutive ones) and two such strings of length one (0 and 1). Now suppose that the result holds for strings of length k or less, and extend a string of length k by adding a bit on the left end. How many strings of length k+1 with no consecutive ones are there?

If the new left-most bit is 0, then the rest of the string can be any of the strings of length k. If the new left-most bit is 1, then the following bit must be 0, and the remaining bits can be any of the strings of length k-2. Thus, the total number of strings of length k+1 is F(k+1) = F(k) + F(k-1), which is the Fibonacci recurrence.

Now consider a different way of computing the count of such strings. Consider odd-length strings of length 2k+1. You can count the number of strings without consecutive ones as follows: Look at the middle bit. If this bit is 0, then you can construct acceptable strings using any acceptable string of length k on the left and any acceptable string of length k on the right. If the middle bit is 1, then the bits on either side must be 0, and the remaining bits can be taken from any pair of acceptable strings of length k-1. So the total number of such strings is F(2k+1) = F(k)^2 + F(k-1)^2. For even-length strings, you take one of the two middle bits and follow the same logic to get F(2k) = F(k)*F(k-1) + F(k-1)*F(k-2). Observing that F(k-1) = F(k) - F(k-2), you see that:

F(2k) = F(k)*[F(k)-F(k-2)] + [F(k)-F(k-2)]*F(k-2)
      = F(k)^2 - F(k-2)^2.

Renaming the indices appropriately gives the formula above.
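The counting argument can be checked by brute force; here is a small Python sketch (the helper name `count_no_consec` is mine, not from the letter):

```python
def count_no_consec(n):
    """Count binary strings of length n with no two consecutive ones."""
    # Encode each string as the low n bits of an integer s;
    # (s & (s >> 1)) is nonzero exactly when two adjacent bits are both 1.
    return sum(1 for s in range(2**n) if (s & (s >> 1)) == 0)

counts = [count_no_consec(n) for n in range(10)]
# The counts are the (reindexed) Fibonacci numbers:
# [1, 2, 3, 5, 8, 13, 21, 34, 55, 89]

# Odd-length identity from the letter: F(2k+1) = F(k)^2 + F(k-1)^2
assert all(counts[2*k + 1] == counts[k]**2 + counts[k - 1]**2
           for k in range(1, 4))
```

The even-length identity F(2k) = F(k)*F(k-1) + F(k-1)*F(k-2) can be verified the same way against the `counts` list.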

This idea illustrates that recursive computation can be much more efficient than iterative computation. The trick is to use the recursive viewpoint to find ways to divide the work of computing a particular result into subtasks that are different from those that come to mind when one thinks about iterating. Other results in the same vein include Karatsuba's method for multiplying polynomials, Strassen's method for multiplying matrices, and the Fast Fourier Transform.

Matthew Saltzman

mjs@clemson.edu

Giving Boost a Boost

Dear DDJ,

I thoroughly enjoyed the concepts of Boost (see "C++ Type Traits," by John Maddock and Steve Cleary, DDJ, October 2000) and took particular pleasure in grappling with how the implementation actually worked. Unfortunately, I now need to go and have a good lie down. I suggest that I was lucky to survive, and that a safety warning be inserted into the source code: "These templates were created by trained professionals in controlled conditions. Do not try this at home."

Claude Brown

claude@earthling.net

MP3 Notes

Dear DDJ,

In regards to Al Stevens's "Into the World of MP3" article: I have never had any MP3 software crash on me in Linux. The list of MP3 toys I have played with is long: mpg123, xmms, bladeenc, lame, grip, and so on. I have lots of CDs and have converted almost all of them to MP3. I just load one in, select all tracks, and hit the rip+encode button. When it's done, it opens the CD drawer, I switch disks, and lather, rinse, repeat. Occasionally, a disk is not in the CDDB database, and I have to enter the disk/track/artist information manually. But I do not ever remember having the software crash. I'm probably lucky, but... glad I'm not using Windows! Thanks for the informative article.

Robert Wuest

rwuest@wuest.org

DDJ