Dr. Dobb's Journal January 2002
Numerical computing typically involves the kind of stuff that goes on behind closed doors. Consequently, the impact of numerical computing is felt by many, but realized by few. Take weather prediction, for instance. From the color of a woolly bear caterpillar's belly to the aches in our creaking bones, everyone likes to talk about the weather, even though we can't do much about it. How else can you explain the popularity of TV's The Weather Channel with its consistently high ratings and web site (http://www.weather.com/) that averages 300 million pageviews and 13 million unique users per month?
Weather prediction and numerical computing have gone hand-in-hand since before digital computers. It was nearly 100 years ago that Norway's Vilhelm Bjerknes, the father of numerical weather prediction, theorized that you could use mathematics to predict weather patterns if you had sufficient information about the state of the atmosphere at any point in time. Bjerknes had the theory down, but lacked the computing power to prove it. In 1922, British physicist Lewis Fry Richardson further applied mathematics to weather prediction, as described in his Weather Prediction by Numerical Process (Cambridge University Press, 1922), but again without computers.
As it turns out, computers weren't turned loose on the problem of numerical weather prediction until John von Neumann recognized the similarities between atmospheric fluid dynamics (that is, the weather) and the nuclear explosions he was researching during WWII. Von Neumann subsequently convinced the U.S. Navy, Air Force, and Weather Bureau to fund the Electronic Computer Project at Princeton University's Institute for Advanced Study (IAS). By any measure, the IAS computer was a powerful machine, computing in 5 minutes a 24-hour forecast that had taken Jule Charney (a protégé of famed meteorologist Carl-Gustaf Rossby) 24 hours on the ENIAC in 1950. Shortly thereafter, the Royal Swedish Air Force Weather Service got into the real-time numerical weather forecasting act using a model developed by Rossby, who had left the University of Chicago for the University of Stockholm.
At von Neumann's urging, the Weather Bureau, Air Force, and Navy established the Joint Numerical Weather Prediction Unit (JNWPU) in 1954 under the direction of Dr. George Cressman. JNWPU's goal was straightforward: to determine whether numerical weather prediction was, in fact, feasible. The JNWPU's workhorse was a state-of-the-art IBM 701 computer that enabled researchers to begin distributing their first computer-prepared forecasts in mid-1955. It is important to remember that Cressman and his colleagues were meteorologists, not computer scientists. Nevertheless, they developed all their own programs for formatting and analyzing data to enable forecast calculations. Cressman, for instance, was credited with developing the "Cressman Analysis" (http://www.asp.ucar.edu/colloquium/1992/notes/part1/node119.html), an objective-analysis scheme that became a standard method for correcting first-guess values at gridpoints using nearby observations. He also wrote an atmospheric forecasting program that, at least until a few years ago, was in use as a verification baseline.
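The core idea behind the Cressman Analysis is simple distance weighting: each observation within an influence radius R contributes its departure from the model's first guess, weighted so that closer observations count more, and the weighted mean departure is added back to the gridpoint. Here's a minimal sketch of that weighting scheme in Python; the function names and the flat x-y geometry are illustrative assumptions, not Cressman's original formulation or code.

```python
import math

def cressman_weight(r, R):
    """Classic Cressman weight: (R^2 - r^2) / (R^2 + r^2),
    which is 1 at the observation point and falls to 0 at radius R."""
    if r >= R:
        return 0.0
    return (R * R - r * r) / (R * R + r * r)

def cressman_correct(first_guess, grid_xy, observations, R):
    """Correct one gridpoint's first-guess value.

    observations: list of (x, y, observed_value) tuples.
    Each observation contributes its departure (obs - first_guess),
    weighted by distance from the gridpoint.
    """
    gx, gy = grid_xy
    num = den = 0.0
    for ox, oy, obs in observations:
        r = math.hypot(ox - gx, oy - gy)
        w = cressman_weight(r, R)
        num += w * (obs - first_guess)
        den += w
    if den == 0.0:
        return first_guess  # no observations in range: keep the first guess
    return first_guess + num / den
```

In practice the scheme was applied over the whole grid and iterated with shrinking radii, so that large-scale corrections were made first and finer detail filled in on later passes.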
By 1958, JNWPU had demonstrated that numerical weather prediction was workable: Timely and accurate numerical predictions were being generated on a regular basis. Furthermore, the group had put in place systems for both automatic analysis and automatic data handling. Its job done, JNWPU was split into three organizations: the National Meteorological Center, under the auspices of the National Weather Service; the Global Weather Center, under the Air Force; and the Fleet Numerical Oceanography Center, under the Navy. Cressman moved on to become director of the National Meteorological Center, where his research on worldwide winds and temperatures at different flight levels resulted in aviation-fuel savings that more than paid for the program. He later served as director of the National Weather Service from 1965 to 1978.
Without a doubt, numerical weather prediction has made strides that were unimaginable a few decades ago. Still, the 1950s and '60s must have been heady times indeed, what with the advent of computers and the breakthroughs they brought. Meteorologists today face a slew of different challenges, foremost among them simply assimilating and processing an unprecedented amount of data. That said, computing power has advanced to the point where meteorological computational models can now represent the atmosphere at higher resolutions than observing systems can provide initial data for. Paradoxically, advances such as these are themselves becoming significant problems for meteorologists: a model is only as good as the data that initializes it.
Of course, our ability to continue with scientific advances requires not only more and more powerful computers, but also the continued participation of dedicated scientists to use them. Certainly, that was the mindset of 1950s-era meteorologists whose commitment to their calling was best summarized by Dr. Cressman: "In the briefest possible summary, I may say that I spent my professional life in pursuit of two main objectives: to learn how to integrate six simultaneous partial differential equations and to figure out what to do with the answers. My colleagues and I penetrated one of the ageless secrets of nature. It was fun, and I'd do something like it again if I had the chance." Gee, wouldn't you think that about says it all?
Jonathan Erickson
editor-in-chief
jerickson@ddj.com