Dr. Dobb's Journal November, 2004
It's a strange thing, how central the notion of information is to modern physics. To cite just one example, something called the "Information Paradox" has been described as one of the fundamental puzzles in physics.
Information, which seemed to be merely the foundation of computer science, is emerging as something much deeper. In some interpretations, information is regarded as the very stuff of reality. That may be a little grandiose, but it seems to me to be the implication of the theories of Stephen Wolfram and Ed Fredkin, discussed in past columns. Wolfram and Fredkin paint similar pictures of a universe that is a program, a cellular automaton running on a gigantic universal grid.
It's also the implication of much of modern scientific thought. We live in an age of information, or an age in which information is coming to be properly appreciated. In the biological sciences, the discovery of the double helix, an information-coding mechanism, changed everything and redirected most research agendas. Quantum mechanics gives information a privileged role in the physical sciences. Philosophers talk about the cybernetic view of reality, which views life and nature as self-organized information flows. Philosophers of science talk less of theories as models of reality and more of theories as a means of processing observational information.
To physicists today, conservation of information seems to be as fundamental a belief as conservation of momentum. And so it was regarded as more than a little offensive to science when Stephen Hawking said that information could disappear without a trace when particles fell into a black hole.
The Information Paradox codifies a deep theoretical and empirically testable conflict between the general theory of relativity (GR) and quantum mechanics (QM).
The Information Paradox came to light after Stephen Hawking announced that black holes, as some would put it, have no hair. That is, there is nothing about a black hole to measure except for its mass and rotation. The many possible histories that might have led to the existence of a particular black hole, the memories of all the different particles that might have fallen into it and lent it their mass (well, not lent, exactly: They won't be getting it back), are lost in the amnesiac featurelessness of the black hole. One black hole of a particular mass and rotation is identical to another.
In 1974, Hawking put a bit more of an edge on this amnesia of black holes, when he discovered Hawking radiation. Black holes, he said, emit radiation, thus decaying and ultimately disappearing over time. And they disappear without releasing any of the information contained in the particles that went into their making, like a magician opening his hand to show that the coin you lent him has disappeared.
Quantum mechanics says this can't be. The QM principle of reversibility says that you have to be able to get that input information back out of any process, that you can time-reverse anything without information loss. QM said that Hawking was wrong, and Hawking said that his conclusion was an inescapable consequence of General Relativity.
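The reversibility that quantum mechanics insists on is, in technical terms, unitarity. A minimal sketch of the argument (standard QM, not anything specific to Hawking's calculation):

```latex
% A closed quantum system evolves by a unitary operator U:
%   |psi(t)> = U(t) |psi(0)>,   with  U^dagger U = I.
% Because U is invertible, the initial state can always be
% recovered from the final one -- no information is ever lost:
\[
  |\psi(t)\rangle = U(t)\,|\psi(0)\rangle,
  \qquad U^{\dagger}U = I
  \;\Longrightarrow\;
  |\psi(0)\rangle = U^{\dagger}(t)\,|\psi(t)\rangle .
\]
% Hawking's original claim amounted to non-unitary evolution:
% many distinct initial states falling into a black hole would
% all end up as the same featureless thermal radiation, leaving
% no inverse map back to the input -- hence the paradox.
```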
So which was right, QM or GR?
In 1997, physicists Hawking, Kip S. Thorne, and John P. Preskill entered into a wager regarding the Information Paradox. (Thorne was the subject of a festschrift, or at least a collection of speeches on his 60th birthday, a couple of years ago, all collected in an interesting book called The Future of Spacetime, W.W. Norton & Co., 2002; ISBN 0393020223, from which I have quoted here in the past and which contains the text of a different Thorne-Hawking wager.) You can find the full text of the wager at http://www.quantumcloud.com/forum/viewtopic.php?t=323, but the crucial paragraphs state the two opposing positions as follows:
Information swallowed by a black hole is forever hidden from the outside universe and can never be revealed even as the black hole evaporates and completely disappears.
Hawking/Thorne
and
A mechanism for the information to be released by the evaporating black hole must and will be found.
Preskill
If the information swallowed by a black hole can't be retrieved, where does it go? Hawking realized from the start that this was a question that had to be addressed. He considered that the massive gravitation of a black hole might somehow overrule the laws of quantum mechanics, a sort of "power makes its own rules" idea; or that a black hole might open a door to another universe through which information could escape. The latter idea would make information conservation a property of the multiverse rather than of any single universe. It also inspired a lot of science fiction writers, who will now have to look for a new one.
On July 21 of this year in Dublin, Hawking acknowledged that Preskill had won the bet.
Hawking's resolution of the Information Paradox runs something like this: The event horizon of a black hole contains quantum fluctuations. These fluctuations exactly encode the information that Hawking formerly thought was swallowed by the black hole. This information is captured and released by the event horizon.
Except not in its original form. This is not a process you'd undergo willingly unless you were really depressed, like a science fiction writer who'd built a career on the notion of gateways into other universes.
Anyway, if I've got it right, that's how Hawking sees the situation today. So the most famous living physicist, the guy who sits in Newton's chair at Cambridge (although, as he points out, it wasn't motorized in Newton's day), had to knuckle under to the primacy of information.
It has been said, but usually by computer scientists, not physicists, that information wants to be free. It now appears that information insists on it. Information isn't going to put up with being penned in a black hole for all eternity.
Hawking and Thorne have duly delivered to Preskill the encyclopedia that was the prize in the bet. But this is not the first Hawking or Thorne bet. They both seem to like to spice up the game of research with side bets, as though not finding Reality to be a sufficiently engaging adversary.
Hawking once bet Thorne and Preskill (both of whom are Caltech professors) that a naked singularity (that is, one not in the center of a black hole) could not exist. He lost that one, too. Hawking and Thorne have bet on whether or not a black hole sits at the core of Cygnus X-1 and on various other matters. The wall opposite Thorne's office at Caltech was once decorated with framed wagers.
Maybe somebody should tell these guys about Gamblers Anonymous, or maybe the Long Bets Foundation. The Foundation (http://www.longbets.org/) was set up, like the Long Now Foundation (http://www.longnow.org/), to encourage long-range thinking. A couple of the bets you might want to get a little money down on (on one side or the other) are:
By 2025, at least 50 percent of all U.S. citizens residing within the United States will have some form of technology embedded in their bodies for the purpose of tracking and identification.
and
In 2012, 75 percent of all revenue for enterprise software companies will be from subscription fees rather than license fees.
Actually, the latter is listed as a prediction, not a bet. Apparently, nobody's ready to put real money down on web services yet.
Incidentally, if you were worried about what Stephen Hawking will do with all his free time now that the bet is resolved, the British magazine New Scientist has taken up the challenge. It ran a contest over the summer asking readers to come up with the next challenge for Hawking now that he's resolved the Information Paradox. And if that doesn't work out, Intel has a plan. Intel seems intent on replacing Hawking's computer every year about this time, so we know what he'll be doing for part of this fall: installing drivers and screen savers.
Ed Yourdon made a bet 12 years ago, a bet that he lost rather spectacularly. In his 1992 book Decline and Fall of the American Programmer, Yourdon predicted that U.S. programmers would "suffer the fate of the dodo bird by the end of the decade."
In fact, during the decade to which he referred, the U.S. programmer job market exploded in the dot-com boom, a period in which programmers were in high demand and "college graduates with degrees in English literature were negotiating six-figure salaries at dot-com companies and demanding sports cars as a signing bonus."
Both quotes are Yourdon's, from the crow-eating introduction to his new book, Outsource: Competition in the Global Productivity Race (Prentice Hall PTR, 2004; ISBN 013147571). This book is not merely the former book reissued and updated, but a new argument, albeit from the same perspective, with new facts and figures. And this time, maybe you're not doomed. Only endangered.
(Although the title just uses the term "outsource," what Yourdon is concerned about specifically is offshore outsourcing, which he sees as sending U.S. programming jobs to India, China, Russia, and the Philippines, among other countries.)
Yourdon's 2004 premise is that his 1992 premise may not have been so much mistaken as ahead of its time, and much of the new book weighs in as evidence of that. Those who hooted at him for his former bad bet are invited to size up his latest sizing-up of the state of the American programmer, and I'm guessing that the current book will not be the fizzle that the other was. I was one of the hooters (there's a sentence I never thought I'd write), and I'm prepared to say that much of the current one seems on target to me.
Yourdon certainly has thought long about the issues he writes about, and offers some sage insights. For example, he points out the importance of FedEx and UPS to the information revolution, and evaluates how ready Poland, say, is to start taking part in the revolution, based on its package-delivery infrastructure.
The book lays out the current situation for American programmers, plots the trends, and makes recommendations, for the individual trying to protect the current job, for the U.S. government, and for the programming community as a whole. And for companies considering outsourcing, he has advice, too. Like don't outsource what you can't manage in-house, don't outsource core competencies, and give your current employees a chance to compete.
The subtitle of the book tips his hand and shows what he thinks is the real solution for American programmers so they don't go the way of the dodo bird: productivity.
Yourdon is fully aware that programmers are not the same as steel workers. As programming project managers know, there are enormous differences in the productive output of different programmers. If U.S. programmers are sufficiently productive, that productivity can offset their higher wages; in the countries that attract outsourcing, lower wages are the biggest argument in its favor.
Of course, there are different kinds of IT jobs, and the outsourcing of some of them, like call-center work, is already a done deal. But you're not in that job market.
Yourdon makes what seem to me to be sensible recommendations regarding government policies: Rig the game to encourage what we want and discourage what we don't. He points out we're a nation of immigrants, but that we shouldn't become a nation of temporary visitors. Don't encourage people to come and take jobs or get an education and then take all the benefits home. Encourage people, instead, to bring the family, stay, and become Americans.
Of course, there's this: Russians can read books, too. And this: Yourdon's last book was written in the doldrums of a recession, just before a long period of prosperity. If the current one is being written in the same economic climate, there is at least the possibility that the current book will be subject to some such surprise as the one that derailed the former book. Being not even an amateur economist, I won't try to guess about that.
The Long Bets Foundation bettors are not afraid to weigh in on the issue, though. Esther Dyson has bet Bill Campbell that both the Wall Street Journal and The New York Times will have named Russia the world leader in software development within the next eight years.
I think that Yourdon makes sense about the risks of outsourcing and the actions that can be taken to protect American programmers' jobs. This time around, I'm betting (but not actual money) on Yourdon.
If Yourdon is right, maybe what we need is a new Fred Terman. Fred Terman at Stanford: Building a Discipline, a University, and Silicon Valley, by C. Stewart Gillmor (Stanford University Press, 2004; ISBN 0804749140) is a new biography of the man usually referred to as the "Father of Silicon Valley."
Terman's remarkable career at Stanford had several major achievements in addition to his Silicon Valley paternity, all documented here.
Terman's own father, Lewis Terman, is more famous in some circles for being the psychologist who invented the Stanford-Binet IQ test and pretty much single-handedly introduced IQ testing in America. Lewis has been described (by an admirer, yet) as a cold-hearted elitist and a believer in eugenics and forced sterilization for those deemed mentally defective. He only partly recanted these beliefs later in life. He has also been described as especially interested in children of "superior intellect" and as being devoted to the young "gifted" subjects of his best-known research, a longitudinal study of intelligence and achievement.
I'd like to have seen more about Lewis Terman and his views in the book, because there are reasons to think that the father's views had a lot to do with how son Fred developed. But that's not Gillmor's focus. From Fred Terman's childhood, we instead get the stories of a precocious youngster learning about practical electronics in the early days of radio experimentation, and I have to admit that it is more entertaining reading.
Interestingly, Terman was, Gillmor says, "constant in his belief that quality could be quantified." The programmer worried that his or her job will be outsourced probably hopes, at least in the short term, that it can't be.
Terman was the first Ph.D. student of Vannevar Bush at MIT. Bush, of course, was one of the half-dozen people who could claim to have invented the computer, although in Bush's case, you had to allow that to include analog machines. He was later the science advisor to FDR during World War II, when that meant advising about ENIAC and the atomic bomb.
Terman studied math with Norbert Wiener, the genius professor who saw the importance of the study of information more clearly than almost anyone and laid one set of foundations for computer science in his work on what he called "cybernetics."
When he got to the point where he was mentoring students, Terman saw the promise in Bill Hewlett and David Packard, and guided and encouraged them in their studies and later in their business. The Terman model of close connection with student-launched tech businesses was cloned elsewhere, but nowhere as successfully as in Palo Alto and environs, where it led to the development of Silicon Valley.
Gillmor's fat (600-plus page) book is a dense biography of an academic administrator, but it does have its moments of interest to anyone who is curious about Stanford University, Silicon Valley's roots, or Fred Terman himself.
And if increasing productivity doesn't save the American programmer and there is no new Fred Terman or Tim Berners-Lee or Marc Andreessen prepared to reignite Silicon Valley or kick off a revolution, well, everything is about information these days and programmers are information specialists. If you can't turn programming skills into a research position in quantum physics, maybe it can get you a job in a genetics lab.
DDJ