Books and Covers

Michael Swaine (http://www.cruzio.com/~mswaine)

When I was in junior high, my English teacher, Mrs. Healey, once asked the class if there was any legitimate way to judge a book by its cover. Surely not by the cover art, I thought as I scratched my flattop. So what else is on the typical book's cover?

The price: definitely a criterion to somebody on an allowance. The author: in those days I was reading everything by Arthur C. Clarke, but it would be a few years before I would appreciate Philip K. Dick. The title: it ought to give you a clue about whether the book is for you. (Let's not think about what that means about the purchasers of the hugely popular For Dummies books from IDG Books.)

We offered all these, but Mrs. Healey wanted more.

Then Linda Bolster raised her hand and said that it might sound crazy but she evaluated books by their publishers.

Yeah, it sounded crazy. But what it really sounded like was some pathetic twerp trying to say what she thought the teacher wanted. Poor Linda. We made her suffer for that.

It took me years to figure out that I do judge books, at least in part, by their publishers, and over the years I have had my favorite book publishers. But I never would have suspected that Springer-Verlag would become one of them.

Opus 1000

Springer has a special role among computer-book publishers. Some publishers are on top of the latest trends, like the Waite Group and the Coriolis Group. Others publish the prestige books, like Addison-Wesley. Still others publish for dummies. Springer is like the Internet: There's a lot of stuff out there you're never going to read, and a few things you must read; you can dive to your chosen obsessive level of detail in your favorite subject, and a lot of it reads like a transcript of a report delivered at a technical conference yesterday afternoon.

Besides all that, Springer is responsible for what are, hands down, the most consistently boring covers in computer publishing in the last 20 years. A shelf of Springer books, all steel grey, makes me think of plumbing. Indeed, a Springer book looks like something you'd feel right about picking up when you're up to your elbows in garbage. (Ah yes, plumbing metaphors. I had occasion, years ago, in discussing some Peter Norton utility, to comment on the aptness of his naming his company after the world's most famous fictional plumber, Ed Norton. Many programmers are, in some sense, in the plumbing business, and a lot of us spend a lot of time up to our elbows in garbage. So on second thought, maybe the Springer covers are more appropriate than the flashy Whatever Group covers.) (The world's second-most-famous fictional plumber would be hard to name, so I arbitrarily anoint Harry Tuttle, the Robert De Niro character in the movie Brazil. Appropriately, there is a software company named after Harry Tuttle, too.)

If you aren't up on the subject matter of the Springer books, they often appear to live down to the boring covers. No doubt Petri nets are fascinating once you get to know them, but they are a closed book to me. The many books on parallel processing and artificial-intelligence topics do speak to me, though.

The longest-running series of Springer books is the "Lecture Notes" series. It's up to 1000 volumes now, and includes among its prizes Jensen and Wirth's Pascal User Manual and Report. The editors decided to congratulate themselves with a celebratory 1000th volume addressing the state of computing today. They went all out and covered the usual steel grey cover with a special steel grey dust jacket.

The State is Uncertain

The book is Computer Science Today: Recent Trends and Developments and it's edited by Jan van Leeuwen (1995, ISBN 3-540-60105-8). The lead article, "A Quantum Jump in Computer Science," might remind you of some of the more speculative essays I've written in this column. I try to jump ahead of the curve now and then. So, apparently, does Gilles Brassard, a professor at the University of Montreal and the editor-in-chief of the Journal of Cryptology. He's one of the pioneers in the fields of quantum cryptography and quantum teleportation. Yup, quantum teleportation. It's all based on quantum-information theory.

Quantum information is measured in qubits rather than in bits. Or maybe I should say it's quantified in qubits; the word "measured" is dangerous to throw around loosely in quantum disciplines. Put it this way: A qubit is the unit of quantum information, which has properties quite different from those of classical information. To visualize a qubit, picture a sphere in complex space with its north and south poles at 0 and 1, the possible values of a classical bit. The possible values of the qubit lie on the surface of that sphere.
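In symbols (my gloss on the standard formalism, not notation from the article), a qubit is a superposition of the two classical values:

```latex
|\psi\rangle \;=\; \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Up to an overall phase, the admissible pairs of amplitudes correspond exactly to points on the surface of that sphere, with the poles playing the role of the classical bit values 0 and 1.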

Quantum computing is an emerging discipline that attempts to describe the behavior of computers built on quantum principles. All computers today do depend on some quantum phenomena, but quantum computers use quantum principles more intimately. Where a conventional computer program chooses between states and follows one computational path, a quantum computer takes all possible paths simultaneously.

Because of the peculiar role of measurement in quantum physics, quantum computing is pretty challenging. Some examples: You can't measure qubits reliably, the very act of measuring a qubit affects the state of the qubit unpredictably, and what you get when you do measure a qubit is just a classical bit after all. Also, qubit math is peculiar: It is sometimes possible to extract from two identical qubits more than twice as much information as you can get from either one alone. This sort of behavior is just what makes quantum cryptography possible, though.
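You can get a feel for the measurement problem from a toy classical simulation (mine, not Brassard's): the amplitudes only determine probabilities, what you read out is a plain classical bit, and the superposition is gone afterward.

```python
import random

def measure(qubit):
    """Measure a qubit given as a pair of (alpha, beta) amplitudes.

    Returns a classical bit plus the collapsed post-measurement
    state: the act of measuring destroys the superposition.
    """
    alpha, beta = qubit
    p0 = abs(alpha) ** 2          # probability of reading 0
    if random.random() < p0:
        return 0, (1.0, 0.0)      # state collapses to |0>
    return 1, (0.0, 1.0)          # state collapses to |1>

# An equal superposition: 0 and 1 each come up about half the time.
plus = (2 ** -0.5, 2 ** -0.5)
bit, collapsed = measure(plus)
```

Run it twice on the same superposition and you may well get different answers, which is what "you can't measure qubits reliably" means in practice.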

At least Brassard says it's possible, and that possibility has some interesting consequences. For starters, both secret-key and public-key cryptographic systems will fall apart when the first quantum computer is built, because quantum computers can extract discrete logarithms in polynomial time.

But that's all right, because quantum-information theory gives us even better cryptographic systems based on quantum cryptography. Brassard has designed protocols that can be used to implement a key-distribution system that is provably secure even against an eavesdropper with infinite computing power. It involves exchanging "very tenuous signals," as he modestly puts it, on the order of a tenth of a photon per pulse. Brassard says that several prototypes for quantum cryptographic systems have been built, including one that is fully operational over 30 kilometers of ordinary optical fiber.
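The best known of these protocols is BB84, which Brassard designed with Charles Bennett in 1984. Its central trick can be suggested with a toy simulation (a classical sketch of the sifting step only, photons and eavesdropper left out): sender and receiver each choose a random basis per pulse, then publicly compare bases and keep only the bits where they happened to agree.

```python
import random

def bb84_sift(n_pulses):
    """Toy BB84 sketch: sender and receiver each pick random bases;
    only pulses where the bases happen to match contribute key bits."""
    key = []
    for _ in range(n_pulses):
        bit = random.randint(0, 1)          # sender's raw random bit
        send_basis = random.choice("+x")    # rectilinear or diagonal
        recv_basis = random.choice("+x")    # receiver guesses a basis
        if send_basis == recv_basis:        # compared publicly afterward
            key.append(bit)                 # bases agree, so keep the bit
    return key                              # about half the pulses survive

shared_key = bb84_sift(1000)
```

The quantum part, which this classical toy cannot show, is that an eavesdropper who measures a pulse in the wrong basis disturbs it detectably.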

Quantum Computers

As for quantum computers, their implementation faces some pretty serious challenges: Quantum information has a way of disappearing, and quantum calculations can spontaneously start running backwards. Nevertheless, researchers are hard at work on prototypes of quantum exclusive-OR (XOR) gates based on several approaches, including atomic interferometry and microwave cavities capable of trapping single photons. That's a logical place to start. A working quantum XOR gate would be a big step toward a quantum computer, since it has been shown that, just as with classical computers, all you need to build a quantum computer is a lot of XORs. In principle.

And they're getting close: One team of researchers has actually built a quantum XOR gate that worked 70 percent of the time. Quantum computers would, in theory, be able to perform certain important classes of calculations exponentially faster than conventional computers. So they're worth exploring even without the teleportation. But programming a quantum computer would require a real paradigm shift. I gather that the problem of spontaneously reversing calculations is not something that can be fixed, so what you'd have to do is learn to write programs in such a way that you still get the right answer if a calculation suddenly starts running backwards. At least I think that's what you'd have to do. And quantum teleportation-but I should leave some surprises for you in case you want to read the article.
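The quantum XOR gate is what the literature now usually calls a controlled-NOT, or CNOT. Stripped of the physics, it's just a 4x4 matrix acting on the four amplitudes of a two-qubit state; this little sketch (my illustration, not from the article) shows it flipping the target qubit exactly when the control qubit is 1:

```python
# CNOT on two qubits, basis order |00>, |01>, |10>, |11>:
# it XORs the control qubit into the target qubit.
CNOT = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]

def apply(gate, state):
    """Multiply a 4x4 gate matrix by a 4-amplitude state vector."""
    return [sum(g * s for g, s in zip(row, state)) for row in gate]

ket10 = [0.0, 0.0, 1.0, 0.0]   # control = 1, target = 0
ket11 = apply(CNOT, ket10)     # control stays 1, target flips to 1
```

On a superposition the same matrix acts on all four amplitudes at once, which is where the "all possible paths simultaneously" business comes in.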

Is Parallel Paralyzed?

There are several articles in this book on parallel computing, a subject that I devoted a lot of ink to some years back in this column. Parallel computing represents a true paradigm shift, a break with the von Neumann model that has dominated computing since the beginning, requiring new machines, new architectures, new languages, new algorithms, and new ways of looking at problems. Research and development have been going on in all these areas, and there are many specialized domains of computing where parallel methods and machines have been employed successfully. But that major paradigm shift hasn't taken place, and it doesn't (at least if you're not in the fray) seem any closer today than it did a decade ago.

I read these articles expecting to be brought up to date on the current state of the art, but I came away a little confused. I guess whether you're depressed or optimistic about parallel computing depends on what corner of the field you want to play in.

In "Quo Vadetis, Parallel Machine Models?," Jiri Wiedermann (vice director and head of the Department of Theoretical Informatics at the Institute of Computer Science of the Academy of Sciences of the Czech Republic, if you will) writes of a "parallel computing crisis" much like the crisis in artificial intelligence engendered by overpromising. He's the expert, but I don't really think the expectations for parallel computing were ever as high as for AI, at least among the general public, and I don't see evidence of huge research budgets suddenly drying up the way they did for AI. That parallel computing was oversold and that it has, in some ways, disappointed expectations, I don't challenge. And Wiedermann gives a nice account of that history.

A more encouraging view comes from articles like Lawrence Snyder's "Experimental Validation of Models of Parallel Computation" and "Scalable Computing," by Bill McColl, where you see a paradigm creeping closer to some critical watershed, beyond which efforts will start flowing ever more rapidly toward practical applications.

Snyder teaches at the University of Washington and is the inventor of the Configurable Highly Parallel architecture (CHiP), the Poker Parallel Programming Environment, and Chaotic Routing. He writes of having reached "the beginnings of 'daily experience' validation" of parallel models. McColl describes scalable computing, a parallel model that he flatly states will become the normal form of computing over the next few years. McColl is the Chairman of Oxford Parallel, a parallel-computing research center at Oxford University. What he's talking about is not exactly the kind of parallel computing that Wiedermann is talking about, and his bold assertion about scalable computing taking over in the next few years is apparently based on the multiprocessing support in the Intel P6.

Some of the problems of parallel computing have already been solved. In "Algebraic Topology and Distributed Computing: A Primer," Maurice Herlihy (Brown University) and Sergio Rajsbaum (National University of Mexico) describe how topological methods dating back to Poincaré and the beginning of topology can be applied to distributed-computing models. They present a number of topological truths about such systems, such as "Read/write complexes have no holes."

Is there any overall conclusion to be extracted from these articles about the current state of parallel computing? Well, since parallel languages and algorithms will depend on the machine architecture, and since that's still an open issue, I'd say Miller Freeman won't be launching a magazine about mainstream parallel computing next year. Maybe in '98?

Highly Logical

On that shelf of grey books I also have a new book called Prolog: The Standard, a Springer opus by Pierre Deransart, AbdelAli Ed-Dbali, and Laurent Cervoni (1996, ISBN 3-540-59305-7). The standardization of Prolog is a significant event in the history of computer science, a major event in the history of logic programming, and a surprising event to those whose experience with the many dialects of Prolog, and with the deep unresolved questions about basic constructs of the language, has led them to think there could never be a Prolog standard. Prolog has some shortcomings: It doesn't interoperate well with other languages; it lacks modules, explicit control directives, and decent development environments; and there are too many dialects, mostly developed at universities to university standards rather than to commercial standards. (Turbo Prolog was developed to commercial standards, but it came with its own list of shortcomings.)

Standardization would be a boon to those trying to remove these shortcomings, and would directly attack the multiple-dialect problem. That's why the standard is good news-but it was a hard battle, as language standards efforts usually are, in part, perhaps because they are undertaken, as the authors point out, "by experts in the language, but novices in standardization."

The book lays out the standard, but it skims over some issues that the authors didn't think needed to be spelled out, such as the arithmetic spec. There is a web site devoted to the standard (http://www.springer.de/), and you can download an executable version of the spec from there, as well as a lot of examples.

The fact that the spec is executable stems from the fact that Prolog is a language for doing logic (PROgrammation en LOGique is the derivation of the name). The spec is effectively written in Standard Prolog, which probably raises knotty questions regarding the provability of its correctness and so forth, but Prolog programmers are just the people to know all about that sort of thing.

The book, fortunately, is more readable than the spec. There was a big discussion during the ten-year standardization process about how formal the specification of the semantics of the language should be, and the informalists won. There is a formal semantic definition, but it's stashed away in an appendix, or "annex," as the authors call it. The language defined in this book is called Standard Prolog. Much of what was undefined or whose definedness was undefined in existing Prologs is defined here, but not everything. And one needed feature of Prolog is reserved for a future version of the spec: modules.

As a casual Prolog programmer and veteran spec reader, I'd say the book does a good job. Readers who have read specs for C or C++ may be puzzled, though, and think that the book's emphases are a little peculiar. That's a function of the peculiarities of the language, I think. Prolog is a pretty simple beast as computer languages go, but some of its elements are devious. A lot of pages in this book are spent spelling out certain key algorithms that Standard Prolog uses, particularly algorithms for substitution and resolution. Then there's the attention paid to writing programs that are provably portable, an important issue, to be sure, but a subtler one in Prolog than in many other languages.
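To give a flavor of what those substitution pages are about (a toy of my own, nothing like the standard's actual formulation): unification finds a substitution for the variables that makes two terms identical, and it's the engine under every Prolog call.

```python
def is_var(t):
    """Prolog-style convention: variables start with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Chase a variable through the substitution to its current value."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Toy unification: compound terms are tuples like ('point', 'X', '3').
    Returns a substitution dict on success, None on failure."""
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):               # unify argument by argument
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None                              # distinct constants clash

# point(X, 3) unifies with point(7, Y) by binding X to 7 and Y to 3.
bindings = unify(('point', 'X', '3'), ('point', '7', 'Y'))
```

Real Prolog layers resolution and backtracking on top of this and, notoriously, skips the occurs check by default; so does this toy.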

But the existence of a standard for Prolog is more important than the strengths or weaknesses of one book. Prolog is an unjustly neglected language that can, in modern implementations, be very efficient. And its programming style (in which once you have stated the problem you have written the program) cuts through the mere engineering that most programs are burdened with to focus on the problem itself.

The Past of Prolog

I try to include some history in this column every month. Here are a few facts about the history of Prolog, courtesy of History of Programming Languages, edited by Thomas J. Bergin and Richard G. Gibson (ACM Press, 1996, ISBN 0-201-89502-1).

Prolog owes much to a seminal 1965 paper by Alan Robinson titled "A Machine-Oriented Logic Based on the Resolution Principle." Prolog is a logic-oriented language based on the resolution principle. The language Prolog was conceived in 1970 by Alain Colmerauer and implemented by Colmerauer and Philippe Roussel. Colmerauer hoped to create a system for processing natural language, specifically French, and when that proved harder than he expected, he spun off the core of the effort as a general-purpose programming language.

The development work was done back in the '70s over 300-baud lines, a humbling reminder to those of us who think we can't function at fourteen dot four. Robert Kowalski made innumerable contributions, and the authors picked up some good ideas from computer science legend Robert Floyd. By the middle of 1973, the basic features of the language as we know it today had been invented.

Backtracking is an extremely important feature of Prolog, but Colmerauer preferred a more complete and computationally expensive method introduced by Floyd. Ultimately, backtracking won out on practical grounds. The cut operator in Prolog trims off branches of the search tree to speed execution. Initially, there were four cut operators, and their syntax, introduced by Colmerauer, was inspired by the work of Noam Chomsky. That last point wasn't particularly important, but I try to find an excuse to work Chomsky's name into the column every six months or so.