Magic

Dr. Dobb's Journal November 1999

By Michael Swaine

Michael is editor-at-large for DDJ. He can be contacted at mswaine@swaine.com.

Any sufficiently advanced technology, Arthur C. Clarke said, looks like magic. The rate of technological advance, the loose interpreters of Gordon Moore say, doubles every 18 months. Multiply Clarke by Moore and the conclusion is inescapable -- a day will come when current technology looks like magic.

The premise of this month's column is that that day is upon us. The magic moment was reached in physics much earlier, of course. It came early in the century when, trusting their math and their data more than their common sense, physicists accepted that the fundamental nature of the world is not only different than anyone thought but different than anyone can think. The quantum moment.

Technology is humbler than quantum physics, though, and only occasionally does it take a leap so large that the mass of people can't see the connection between the new technology and older, accepted (even if not really understood) technology. The telephone or telegraph may have been one such magic technology, but radio was not -- it was originally known as "the wireless," defined in terms of existing technology. It's like something you already know, but with waves in place of wires. Television was radio with pictures. Movies were moving pictures; that is, defined in terms of photography. New technologies may be startling when you first see them -- amazing that such a thing can even be done -- but usually they are understood as extensions of known technologies. Some technologies, like lasers, seem to come from nowhere, and seem a bit more magical.

Now, though, the pace has picked up. Computers are the reason. Children who in a past era would have been taught to use particular tools are now being taught how to build tools. Steve Wozniak has been quoted as saying that computer science is the most important discipline that children can be taught. Such grandiose claims make a computer scientist a little embarrassed, but is there any more powerful tool to place in anyone's hands than a knowledge of programming?

It's a tool that's being handed to a lot of kids. This has been happening, at an increasing pace, for long enough for generations of programmers to have made contributions to the advancement of technology. No wonder the pace has picked up. No wonder it's starting to look like magic. But that's my premise, and I haven't supported it yet. Maybe the following observations will prove the point.

Attaining Invisibility

The goal of MIT's Oxygen project (described in Scientific American, August 1999, and http://www.lcs.mit.edu/anniv/) is to push computer and information technology beyond what passes for ease-of-use today, making it not only accessible to everyone, but richly exploitable by everyone. Michael Dertouzos, director of MIT's Laboratory for Computer Science, envisions a world in which the technology we now access by typing and staring at screens in boxes on desks will be accessed via speech directed to the walls or to a tricorder-like device that everyone will carry. Thirty MIT faculty are involved in the five-year project to develop Oxygen, which Dertouzos hopes will become as pervasive as the air. (In the speech in which he laid out the Oxygen project, Dertouzos predicted a 300-fold increase in human productivity in the next century. In the Scientific American article, this has been scaled back to a modest 300 percent increase. I'm pretty sure Dertouzos meant 300-fold, or 30,000 percent.)

Oxygen comprises eight technologies: a handheld device called Handy 21, an in-wall or in-car-trunk device called Enviro 21, a network for collaboration called Net 21, speech recognition, intelligent information retrieval, tools that allow anyone to script the devices and automate repetitive tasks, intelligent and informed tools for collaboration, and the ability to customize the technology to your own needs. One of the features of the current technological landscape that is not included in the vision is shrinkwrapped software. Apps will go away, Dertouzos predicts.

Dertouzos's eight technologies are technologies that we would probably all agree are desirable, and that are being worked on in places other than MIT already. But what makes them special, and makes the Oxygen project interesting, is Dertouzos's claim that these are the only important technologies. Actually, it's just the last five technologies that he makes this claim about. He says:

The five technologies of speech (and other perceptual capabilities), knowledge access, automation, collaboration and customization are the only new kids on the block. Out of the thousands of things that we can imagine doing in the new world of information, these five are the foundations on which any new activities that help us do more by doing less will be built.

Doing more by doing less. It's Dertouzos's mantra. And when it works, it's magic.

Quicker Than the Eye

One technology crucial to the Oxygen project is the Raw microchip, the project of Anant Agarwal of MIT. The idea is to get rid of the traditional interface to the microprocessor -- the instruction set -- and let the software have access to the raw hardware.

The Raw chip is an array of identical tiles, each containing memory and function units, and each connected to its neighbors by wires. Excess logic gates at the junction between tile and wire make up a programmable switch. The (software) compiler can program those switches to direct the flow of signals through the chip. Among other things, the software can create registers as needed, storing data in a convenient spot to break up long wires that would incur long delays.

Agarwal's group has built a Raw simulator and written a compiler to take advantage of it. In a test application, the compiler achieved a 10-fold speed increase over any existing conventional processor. Students tweaked the code for a further 10-fold increase. Agarwal hopes to see compilers developed that get that full 100-fold speedup when the chip becomes a real device.

Doing the Impossible

...trying to find a computer simulation of physics, seems to me to be an excellent program to follow out...and I'm not happy with all the analyses that go with just the classical theory, because NATURE ISN'T CLASSICAL, dammit, and if you want to make a simulation of nature, you'd better MAKE IT QUANTUM MECHANICAL, and by golly it's a wonderful problem because it doesn't look so easy.

-- Richard Feynman (1981) (International Journal of Theoretical Physics, v.21, p. 486)

The open source movement is invading every corner of computer science. Now it has moved into quantum space with the open source quantum computing project. The goal of the project is to simulate a quantum computer on a conventional one. The web site for the project (http://www.openqubit.org/) has become a portal for quantum computing work.

Actual code has been released recently. There are now in fact several quantum computer simulators, and the field has become so crowded that it requires reviewers. A review of all existing simulations at press time has been posted at http://www.dcs.ex.ac.uk/~jwallace/simrevab.htm. Quantum computing -- quick refresher here -- promises to surpass the theoretical limits of conventional computing by taking advantage of the quantum nature of reality. In a conventional computer, bits take on one of two discrete states.

In a quantum computer, quantum bits -- or qubits (pronounced "cue bits") -- exist in a quantum superposition of states when not being measured (measurement collapses each of them to some discrete state).

This indeterminacy is the source of the computational power of quantum computers, which can be seen by considering memory registers built of qubits. Describing the state of a quantum register of size n requires 2^n complex numbers, so such a register can hold a superposition of all 2^n possible values at once, while a conventional register of the same size holds just one n-bit value at a time. The storage capacity of quantum registers thus increases exponentially with size. That's where the magic lies.
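That exponential blowup is exactly what makes quantum computers hard to simulate conventionally: any simulator must track all 2^n amplitudes explicitly. A minimal sketch (my own function name, not from any of the simulators mentioned here):

```python
# Representing an n-qubit register as a vector of 2**n complex
# amplitudes, as any conventional simulator must.
import numpy as np

def zero_register(n):
    """Return the state |00...0> of an n-qubit register."""
    state = np.zeros(2**n, dtype=complex)  # one amplitude per basis state
    state[0] = 1.0                         # all probability on |00...0>
    return state

for n in (1, 4, 8, 16):
    print(n, "qubits need", 2**n, "complex amplitudes")
```

Note that 40 qubits would already demand a terabyte-scale amplitude vector, which is why the simulators reviewed above stay small.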

Problems that can be solved in polynomial time are generally thought of as being tractable, while problems that can only be solved in exponential time are thought of as intractable. Quantum computing, by virtue of this exponential expansion of storage capacity, promises to make the intractable tractable.

In practice, this is exploited through quantum parallelism. The idea is to apply a function to a quantum register, thereby exploiting the exponential capacity of the register to perform a vast number of calculations simultaneously. Exactly how you exploit this capacity has been a bit of a puzzle. The problem is that although you have essentially computed the outputs corresponding to all possible combinations of inputs, a measurement hands you only one of them. The trick, then, is to arrange the superposition so that what you do measure turns out to be something interesting and useful. Peter Shor and L. K. Grover, among others, have come up with algorithms that do just that.
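A state-vector simulator makes quantum parallelism concrete: a single application of the map taking |a>|0> to |a>|f(a)> leaves f evaluated at every input, encoded in the amplitudes of the joint state. A sketch under my own naming (the loop is the simulator's cost; the quantum device would pay one step):

```python
import numpy as np

def parallel_eval(f, n_in, n_out):
    """Simulate applying |a>|0> -> |a>|f(a)> to an equal superposition
    of all n_in-bit inputs; returns the joint state vector."""
    dim_in, dim_out = 2**n_in, 2**n_out
    state = np.zeros(dim_in * dim_out, dtype=complex)
    amp = 1.0 / np.sqrt(dim_in)          # equal weight on every input a
    for a in range(dim_in):              # simulator pays 2**n_in steps;
        state[a * dim_out + f(a)] = amp  # the quantum hardware pays one
    return state

# "One step" evaluates f(a) = 7**a mod 15 for all 16 four-bit inputs at once
state = parallel_eval(lambda a: pow(7, a, 15), 4, 4)
```

Measuring this state returns a single (a, f(a)) pair at random, which is why clever post-processing, like Shor's, is needed to get anything useful out.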

Is This Your Number?

It was Shor's algorithm for factoring large numbers, developed at Bell Labs in 1994, that brought about a significant shift in the perceived seriousness of the field of quantum computing. Shor's algorithm, if implemented on a quantum computer, would crack many encryption schemes in polynomial time, making them worthless.

This is what most of the quantum simulation work has focused on -- simulating Shor's quantum algorithm for factoring large numbers.

It turns out that the hard part of factoring a large number can be reduced to finding the period of a particular periodic function involving the number. Shor's algorithm finds the period of that function.

The number to be factored is n; x is an integer coprime to n; the period of the function is r; and the function itself is f(a) = x^a mod n.

The algorithm uses two quantum registers. It loads register 1 with an equally weighted superposition of all integers from 0 to q-1, where q is a power of 2 between n^2 and 2n^2. It calls the contents of register 1 a. Then it applies the transformation x^a mod n to register 1 and stores the result -- a superposition of all the outputs -- in register 2. Due to quantum parallelism this takes only one step. It then measures the second register, observing some value k. The superposition of output values has collapsed into the value k. Quantum magic also has the effect of forcing the contents of register 1 into a state consistent with the value measured in register 2. This can still be a superposition of values, but the values all have to satisfy x^a mod n = k.

The algorithm then computes the discrete Fourier transform on register 1. Again, this is a one-step operation, thanks to the magic of quantum parallelism. This places register 1 in a state where observing its value gives, with high probability, a number close to a multiple of q/r, from which r, the desired period of the function, can be extracted by a continued-fraction calculation. Anyway, that's what they tell me.

A conventional computer then uses r to finish the factorization. Please don't build a quantum computer and try to run Shor's algorithm on it based on my description; I undoubtedly got something wrong. I just wanted to give you the flavor of it, which is all I was able to extract from the primary sources.
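The classical bookkeeping around the quantum step can be run today. In this sketch (function names are mine), the period is found by brute force, standing in for the quantum subroutine; the rest is the conventional half of the reduction described above:

```python
from math import gcd

def period(x, n):
    """Brute-force the period r of f(a) = x**a mod n -- the one step
    a quantum computer would perform in polynomial time."""
    r, val = 1, x % n
    while val != 1:
        val = (val * x) % n
        r += 1
    return r

def factor_via_period(n, x):
    """Classical half of Shor's algorithm, given a base x coprime to n."""
    assert gcd(x, n) == 1
    r = period(x, n)
    if r % 2 == 1:
        return None                 # odd period: retry with another x
    y = pow(x, r // 2, n)
    if y == n - 1:
        return None                 # trivial square root: retry
    return gcd(y - 1, n), gcd(y + 1, n)

print(factor_via_period(15, 7))     # -> (3, 5)
```

The brute-force period search takes exponential time in the size of n, of course; that search is precisely what the quantum Fourier transform step replaces.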

Meanwhile, though, research into actually building a quantum computer, not just a simulation, slogs along at various research sites throughout the world.

Some labs have actually created working qubits, which is no mean accomplishment. Yasunobu Nakamura and his colleagues at NEC Fundamental Research Labs in Tsukuba, Japan, have come the farthest to date in constructing a device capable of remaining in the necessary quantum state long enough to do some computation (all of two nanoseconds). Their device uses a superconducting quantum dot held at a temperature of 0.03K, so don't look for the home hobbyist kit version any time soon. The next step in the gargantuan bottom-up process of building a quantum computer is to figure out how to wire up a few qubits to make a logic gate. And as the Raw microchip project shows, you can do a lot with logic gates.

Sun's Latest Trick

Sun has released details of its MAJC (pronounced "magic," which stands for "Multiprocessor Architecture for Java Computing") chip architecture. As I understand it, MAJC was developed to move and manipulate nontext data efficiently and to promote Java, although that's not exactly how they put it in the presentations, which describe such MAJCal features as:

DDJ


Copyright © 1999, Dr. Dobb's Journal