Beautiful Math

Dr. Dobb's Journal June 2002

By Michael Swaine

Michael is editor-at-large for DDJ. He can be contacted at mike@swaine.com.

Mathematicians and scientists are becoming cultural heroes, the University of Wisconsin's Why Files (http://www.whyfiles.org/) concludes — on what looks to me like pretty slim evidence. Well, at least mathematicians are in the public eye. Earlier this year, as the Hollywood Academy Awards extravaganza drew near, controversy swirled about the movie A Beautiful Mind and its subject, mathematician John Forbes Nash, Jr. Strange rumors were being spread about Nash, apparently in a (clearly futile) attempt to influence the awards voting. But the truth about Nash was already strange enough, and as for the movie and the awards it won — well, the Academy of Motion Picture Arts and Sciences has its collective opinion, and I have mine. My opinion is that no movie dealing with mathematics has ever surpassed Donald in Mathmagic Land, which changed my life.

I'm not kidding, or only partly kidding, anyway. I've always thought of math as fun — which is probably why I changed my major to psychology at that point in my academic career when math classes first started to get difficult. I might have stayed with math if Homer B. Tilton had been my teacher; consider this my plug for Tilton's always-entertaining work in Dr. Dobb's Math Power Newsletter, which you can find out about, read, or subscribe to at http://www.ddj.com/maillists/.

This sense of math as just good fun comes through in Sarah Flannery's book, In Code. Flannery is the young Irish mathematician who was much in the public eye (sure, let's call her a popular hero) when she won the 1998 Intel Award for Excellence in Science. Past winners in this 60-year-old competition, which was formerly sponsored by Westinghouse, include Nobel laureates, MacArthur Foundation fellows, and Fields medalists (that's the equivalent of the Nobel prize for mathematicians). Winners whose names should be familiar to any reader of this publication include Ted Hoff and Raymond Kurzweil. What really got Flannery in the news, though, was her claim to have come up with a completely new encryption algorithm that was better than RSA. More about that anon.

Flannery did her programming using Mathematica, and coincidentally the creator of that program — himself a former child prodigy and a MacArthur fellow — has a new book out, too. That's right, Stephen Wolfram has finished writing his magnum opus, A New Kind of Science, which he says is going to revolutionize science. It should be in print by the time you read this. Wolfram could benefit from taking humility lessons from Steve Jobs, but I have no doubt that the book will be a must-read for the mathematically inclined, even at 1197 pages. (The good news: It has a lot of pictures. The bad news: They're probably going to be more technical than the text.)

I'm also looking forward to the release of PythonCard for the Macintosh platform, which I probably will review. This month, I'll report on a conversation with its creator, Kevin Altis.

Also this month, I'll begin what I hope will be a continuing thread in this column: a somewhat structured look at the ideas behind quantum computing.

A Dutiful Mind

Sarah Flannery's In Code (Workman Publishing, 2001; ISBN 0-7611-2384-9) is a good book to recommend to young readers, but since it is neither condescending nor trivial, it also has things to offer to math-obsessed adults. It was written "with David Flannery," the chief author's math teacher father, who helped with the math parts. That's most of the first half of the book.

Threaded through the math is an account of Flannery's background, pleasant to my way of thinking — growing up with a blackboard in the kitchen for working problems, math puzzlers given regularly to all five children from an early age. She tells, with pleasing frankness, how she got into the competition (several competitions, in fact) and won the prizes, how the press descended on her, and how she reacted. The math in the book begins with familiar puzzles, then moves into a discussion of modular arithmetic, Euclid's algorithm, simple crypto, elementary number theory, Fermat's Little Theorem, and one-way functions. This gives the background necessary (for less experienced readers) for the discussion of public key crypto and the RSA algorithm.

An aside: The "A" in RSA is Leonard Adleman, who has moved on to other challenges — specifically DNA computing (from RSA to DNA, one might say, of Leonard A). The March 15 issue of Science carried the story of his DNA-based computer solving a serious computing problem. The big virtue of DNA computing, apparently, is massive parallelism; the big disadvantage is that the computer makes a lot of errors. (Massive parallelism and frequent errors; sounds a little too much like my work habits for comfort.) The bottom line seemed to be that if you couldn't lay your hands on an electronic digital computer, or at least a pocket calculator, then a DNA-based computer would be the next-best thing. There's another potential virtue of DNA computing, though, which I'll get back to later. End of aside.
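For readers who want to poke at this material themselves, here is a minimal Python sketch (mine, not the book's) of two of the ingredients just mentioned: Euclid's algorithm for the greatest common divisor, and a spot-check of Fermat's Little Theorem, which says that a^(p-1) ≡ 1 (mod p) when p is prime and doesn't divide a.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))               # 21

p = 101                            # a prime
for a in (2, 3, 10, 99):
    # Fermat's Little Theorem: a^(p-1) mod p is 1 for each of these a
    assert pow(a, p - 1, p) == 1
```

Python's three-argument pow does the modular exponentiation efficiently, which is exactly the operation that makes RSA practical.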

Flannery's algorithm, which she calls the Cayley-Purser algorithm, gets presented in an appendix along with the RSA algorithm, for direct comparison. The big advantage of Cayley-Purser over RSA, Flannery thought initially, was that Cayley-Purser uses only modular matrix multiplication, while RSA relies on modular exponentiation. That much is true, but it turns out that she didn't consider everything in developing the algorithm, and it has what is probably a devastating and unfixable security flaw. Oh, well.
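The RSA half of that comparison can be sketched in a few lines of Python. The numbers below are the standard tiny textbook values, not anything from Flannery's appendix; a real RSA modulus runs to hundreds of digits.

```python
# Toy RSA round trip, purely illustrative.
p, q = 61, 53
n = p * q                     # public modulus: 3233
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent (Python 3.8+): 2753

m = 65                        # a "message" encoded as a number < n
c = pow(m, e, n)              # encrypt: m^e mod n
assert pow(c, d, n) == m      # decrypt: c^d mod n recovers m
```

The whole scheme rests on that repeated modular exponentiation, which is why a matrix-multiplication-only alternative looked attractive on speed grounds.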

She deals in the book with this scientific disappointment and with the aftermath of fame: the praise she received as some sort of selfless genius for not patenting the algorithm (which probably couldn't have been patented anyway); the thrilling and unexpected phone call from Ron Rivest; the not-so-thrilling but equally unexpected offer to appear in a Pepsi ad; and, the ultimate compliment, the Spice Girls saying that she had Girl Power.

Flannery turned down Pepsi, but admits she'd be willing to do a commercial for Mathematica.

As part of her prize winnings, Flannery got to go to Stockholm in December of 1999 to see that year's Nobel prize awards. These would, I guess, have included the prize to physicists Gerardus 't Hooft and Martinus J.G. Veltman for cleaning up the math in quantum physics.

The Essential Book

Four years earlier, it was John Nash who was making an appearance in Stockholm, when he accepted his Nobel prize for his groundbreaking work in game theory. Nash's early brilliance, his fundamental contributions to several areas of mathematics, his three dark decades of mental illness, and his triumphal recovery, culminating in receiving that Nobel prize, have all been documented in the Oscar-winning Ron Howard movie and in the book by Sylvia Nasar on which the movie was based. But Nash tells his own story, briefly and in his inimitable style, in The Essential John Nash, edited by Harold W. Kuhn and Sylvia Nasar (Princeton University Press, 2002; ISBN 0-691-09527-2). (As she prepared to report on her original work to Intel, Sarah Flannery drew inspiration from Eric Temple Bell's classic Men of Mathematics. In his brief autobiography in this book, Nash also cites Bell's book as an important childhood influence. He apparently read it at about the same age as Flannery, but 40 years earlier.)

Most of this book, though, is mathematics, and most of it is by John Nash. This is a collection of some of "the most important original contributions to mathematics of the twentieth century," according to one academic reviewer. I won't claim that this book is altogether a fun read, although it does contain a description of the game called "Nash," which Nash invented but which you might know under another name. But some of us like this kind of stuff.

Nash's writings included in the collection are five important papers on game theory, a memo on parallel computing written in 1954, and several other not-so-easy pieces. Nash's doctoral dissertation is included in photocopy form so that you can see his handwriting in the equations. All the writings display Nash's terse, dry style. You can also see in detail how Nash figured out what John Von Neumann couldn't — how to make sense of nonzero-sum games, using what are today called "Nash equilibria." There are pictures of Nash and other famous mathematicians and scientists, and there's commentary by some who knew him, but the bulk of the book is serious math. And it's more expensive than the more readable A Beautiful Mind. If I have now succeeded in scaring off everyone but those who loved that Donald Duck movie, I have done my job.

The Quantum Thread

Here is where I start what I hope will be a continuing thread in this column: exploring some of the ideas behind quantum computing. I've written about quantum computing before, but I hope to be more systematic in this series. I wouldn't be surprised if my sketchy physics education leads me into a mistake or two; the large number of you who know more than I do about quantum physics might consider it a challenge to find those errors and let me know about them. I also invite pointers to other sources on the quantum computing subjects I discuss here, such as this month's topic.

Quantum computing, like quantum physics generally, is rife with peculiarities. It is profoundly different from the Von Neumann architecture we've been programming on for the past half century or so, and from the kind of programming we've been doing on it. One of the strangest aspects of quantum computing, though, is not really a uniquely quantum feature at all: reversibility.

Take It Back

Physical processes are, at the fundamental level, time reversible, but computations on a conventional digital computer are, in general, not. This is clearly the case at the lowest level of computer architecture: An AND gate has two inputs but only one output, so it is impossible to reconstruct the inputs from the output. In the execution of every AND, some information is destroyed.

A NOT gate, however, is logically reversible. It has one input and one output, and you can recover the input given the output. (Not only is a NOT gate reversible, but it actually executes the same function if its inputs and outputs are reversed: It's still a NOT if the bits flow through it west-to-east rather than east-to-west.)
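The difference between the two gates is easy to make concrete. The sketch below (my illustration, not anything from the literature) tabulates both gates as dictionaries: AND sends three distinct input pairs to the same output, so the inputs can't be recovered, while NOT is a bijection and can be inverted.

```python
# Truth tables as dictionaries: input -> output.
AND = {(a, b): a & b for a in (0, 1) for b in (0, 1)}
NOT = {a: 1 - a for a in (0, 1)}

# Three different input pairs all produce 0, so the preimage of 0
# is ambiguous: information has been destroyed.
zero_preimage = [inp for inp, out in AND.items() if out == 0]
print(zero_preimage)               # (0,0), (0,1), and (1,0)

# NOT maps distinct inputs to distinct outputs, so it has an inverse.
inverse = {out: inp for inp, out in NOT.items()}
assert all(inverse[NOT[a]] == a for a in (0, 1))
```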

Reversible computing is more than just computing with reversible gates, though. In reversible computing, the entire computation is actually done both forward and backward. It works like this: You run the inputs through the system, dump the outputs to a printer, and then feed the outputs back into the system at the back end. The system and the computation run backward, undoing the morning's work in the afternoon, so to speak, and output the original inputs at the front end. This sounds laughably inefficient: Not only is the whole computation executed twice (once in reverse), but the second pass happens after the desired result has already been obtained and printed. What possible purpose could there be in computing in this manifestly silly way? Well, it turns out that this peculiar way of computing actually offers some nice benefits even for classical computing, and is absolutely required for quantum computing. One of these benefits is energy-free computation.
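The forward-then-backward trick can be shown in miniature. In this toy of mine, each step XORs a result into a scratch register; because XOR undoes itself, running the same step again "uncomputes" the work and restores the starting state, which is the essence of Bennett's construction.

```python
def forward(x, scratch=0):
    """Compute x*x reversibly by XOR-ing it into a scratch register."""
    scratch ^= x * x
    return x, scratch              # nothing is overwritten; x survives

def backward(x, scratch):
    """The same XOR applied again undoes the computation."""
    scratch ^= x * x
    return x, scratch

state = forward(7)                 # (7, 49): read the answer off here
print(state)
assert backward(*state) == (7, 0)  # running backward restores the start
```

Read the answer (the printer step) between the two passes; afterward the machine is back where it began, having erased nothing.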

In 1949, John Von Neumann (him again) asserted that there was a minimum amount of energy required to perform a unitary act of computation, and he gave this minimum a precise value: kT ln 2, which is about 3x10^-21 joules at room temperature. As a result of landmark papers by Rolf Landauer in 1961 and Charles Bennett in 1973, it became clear that Von Neumann was wrong: There is no minimum amount of energy required to compute, and the only necessary expenditure of energy in computation comes with the erasing of information. If no information is erased (that is, if the computation is reversible), then no more energy need be expended in computing than is lost by a ball that rolls down one side of a valley and up the other in a friction-free environment.
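The kT ln 2 figure is a one-liner to check. With Boltzmann's constant and room temperature plugged in, it comes out to a few zeptojoules per erased bit:

```python
import math

k = 1.380649e-23      # Boltzmann constant, joules per kelvin
T = 300.0             # room temperature, kelvin
limit = k * T * math.log(2)
print(limit)          # roughly 2.9e-21 joules per erased bit
```

For scale, that's more than a billion times smaller than the energy a real 2002-era logic gate dissipates per operation, which is why this was a theoretical rather than practical concern.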

And this is not just a theoretical result. Reversible computation can be realized in a physical system in which every energy-consuming action is paired with an equal energy-producing action, for no net energy loss. Reversible computing is possible, and where it can be employed, it can provide essentially energy-cost-free computing.

Avoiding Entanglements

One place where reversible computing not only can but must be employed is in quantum computing systems. The requirement of reversibility has to do with decoherence. Quantum information in quantum computers is maintained in quantum registers. To keep the computation coherent, these registers have to be kept isolated; otherwise, you get entanglement with the environment, and decoherence. Any heat dissipation from the system means that the quantum registers have not been kept isolated. Therefore, a workable quantum computer can't dissipate heat, its entropy has to remain constant, and all its state changes have to be adiabatic. This just means that all its computations have to be reversible. Reversibility is a necessary, but not sufficient, condition for any quantum computer, and for any quantum computation.

In quantum computers, the reversibility must be implemented right down to the level of gates. A quantum gate is broadly analogous to a classical gate, but to avoid destroying the superposition of states in the quantum registers, quantum gates are designed symmetrically, so that their input state can be derived from their output state. This is usually achieved by passing one or more of the inputs directly through to the output stage. Such a gate is called a "controlled" gate because the signal that passes through unchanged can be seen as a control signal, affecting the other signals without being itself affected. But the reversibility goes all the way up, too. This leads to strange algorithms, like Shor's quantum factoring algorithm (with its alarming implications for crypto), which I have discussed here before and about which I will have more to say in future installments in this series.

Interestingly, in his 1973 paper, Bennett pointed to an extremely widespread, already-existing system that routinely performs its computations reversibly: DNA, or more precisely, DNA and RNA viewed as a computational system. This is the peg that Leonard Adleman has hung his hat on recently, and it carries the potential reversible-computing virtue of being able to do computations at an arbitrarily low energy cost. If it works, one might think of it as another kind of Free Software Foundation.
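The "controlled" gate idea has a classical shadow that's easy to play with: the controlled-NOT (CNOT) gate, whose quantum version is a workhorse of quantum circuits. Here's a sketch of its classical behavior; the control bit passes through untouched, the target bit flips when the control is 1, and applying the gate twice is the identity, so the gate is its own inverse.

```python
def cnot(control, target):
    """Controlled-NOT on classical bits: flip target iff control is 1."""
    return control, target ^ control

# Exhaustively check reversibility: CNOT applied twice restores
# every possible input state.
for c in (0, 1):
    for t in (0, 1):
        once = cnot(c, t)
        assert cnot(*once) == (c, t)
```

Nothing quantum is happening here, of course; the quantum CNOT does the same bookkeeping on superpositions of these basis states, which is exactly why the classical version had to be reversible in the first place.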

Python Dreams

With the discussion of awards earlier in the column, this might be an appropriate place to mention that the Free Software Foundation gave Python creator Guido van Rossum its fourth annual FSF Award for the Advancement of Free Software in February. That may be a clue that Python is something special. Kevin Altis thinks it is.

Kevin dropped by the other day. Kevin and his wife were traveling down Interstate 5 and came to Summer Jo's, mostly so Kevin could show me the current state of his latest project, PythonCard. I liked what I saw; in my opinion, the world can't have too many rapid application development tools, and I like the idea of Python as the scripting language for a highly visual development environment.

PythonCard, you see, is a software construction kit in the spirit of Apple's HyperCard, written in Python; using Python as its scripting language; developed for Linux, Windows, and the Mac OS; and distributed as open source under the Python license. The inestimable Dan Shafer is doing some documentation work for the project. The Mac version was, when we talked, possibly imminent. A visit to the SourceForge site will tell you the current status of all versions.

Why is Kevin doing this? Does he think he's going to get rich resurrecting HyperCard? No, he's doing it because he thinks it would be useful and fun and challenging, and because it's a tool he can see himself using. When he finished his demo, he talked with me about the epiphany he had when he really "got" open-source software. It's obvious, he said at one point, that this is the way to develop software. And it is obvious if you come at it from a science-influenced, open-systems, nonzero-sum viewpoint. Come to think of it, one could probably come up with a compelling defense of open-source software in terms of Nash equilibria.

DDJ