Two For the Price of One—Maybe

Dr. Dobb's Journal May, 2005


Dual-core processors, those devices that effectively give you two processors on a single chip, are the talk of the town. The appeal is that dual-core processors can execute two instruction streams at once, handling more threads in parallel. In other words, when compared to their single-core cousins, multicore processors can run at slower clock speeds and lower voltages, yet still deliver higher overall performance. That's the promise, anyway.

To date, both AMD and Intel have announced dual-core offerings. At LinuxWorld, for instance, AMD demonstrated its dual-core AMD Opteron processors running systems from Cray, HP, and Sun. For the time being, AMD claims to be the only vendor to have publicly demonstrated x86 dual-core server solutions. For its part, Intel's plans revolve around its dual-core Pentium Extreme Edition, which includes Hyper-Threading Technology (HTT) that processes four threads simultaneously, and its non-HTT Pentium D processor. Dual-core architectures were in the spotlight at the recent Intel Developer Forum. Not to be left out, Analog Devices has extended its Blackfin processor family with the dual-core ADSP-BF561. Then there's the ARM PrimeXsys dual-core, the Texas Instruments OMAP5910 dual-core processor, the Atmel dual-core DIOPSIS 740 DSP, and so on. You get the picture.

So far, most dual-core activity has been on the server side. IBM, for instance, has been offering dual-core implementations of its Power4 and Power5 for a couple of years, and both AMD and Intel have targeted the server market until now. Still, they all have designs on the desktop. AMD recently showed off its "Toledo," which sports two Athlon 64 cores on the same chip and specifically targets desktop systems. Likewise, Intel's "Smithfield" targets the desktop, while its yet-to-be-released "Yonah" is designed for laptops. In short, it appears that Intel, AMD, and other vendors will be moving entire processor lines to multicore architectures over the next few years. According to Intel, 70 percent of all its desktop and mobile processors, and 85 percent of all its server processors shipped, will be dual core by the end of 2006. Moreover, Intel plans on having devices with four cores running up to eight threads each by the end of the decade.

This is exciting stuff for all computer users, but especially software developers. Well, maybe "exciting" isn't the right word. "Challenging" might be a better way to describe what lies ahead, although some might say "a pain in the keister" is a better fit. Nevertheless, as Herb Sutter pointed out in "A Fundamental Turn Toward Concurrency in Software" (DDJ, March 2005) and Craig Szydlowski in this month's "Multithreaded Technology & Multicore Processors," we're about to enter a new world in which terms like "multithreaded," "concurrency," and "parallelism" are the norm, rather than the exception. Dealing with multithreaded applications that run on multicore machines will likely require new tools, new techniques, and a new way of thinking.
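To make that shift concrete, consider the classic shared-state hazard that multicore machines surface far more readily than single-core ones. The sketch below (in Python, purely for illustration; it isn't drawn from either article cited above) shows several threads incrementing one counter, with a lock guarding the update:

```python
import threading

# A shared counter guarded by a lock. On a true multicore machine,
# unsynchronized updates from concurrent threads can be lost.
counter = 0
lock = threading.Lock()

def worker(iterations):
    global counter
    for _ in range(iterations):
        with lock:  # remove this lock and the final count may come up short
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 -- every increment accounted for
```

The point isn't the lock itself, but the mindset: on a single-core machine this bug might hide for years; with two or more cores genuinely running at once, it bites early and often.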

That's the good news. On the flip side, all kinds of new nontechnical issues come into play. Take software licensing, for instance. According to Craig Szydlowski, Microsoft says it intends to license software on a per-processor-package basis. In other words, one license for one processor, no matter how many cores are in the CPU. This will surely seed the market and create demand. Other companies haven't made this commitment, however. Although it hasn't made a formal statement, Oracle seems to be leaning toward licensing its software on a per-core basis: A dual-core processor would require two licenses, even though it is running in a single machine. Likewise, BEA hasn't made a firm commitment, although it reportedly is considering a middle-of-the-road approach, whereby licensing might be on a per-core basis, but at 1.25 times the price of a single-core license, instead of twice the price.

The bottom line is that multicore computing is a question of "when," not "if." When will the processors be affordable enough for widespread adoption? When will the development tools be there to build software? And when will the applications be there to take advantage of the powerful capabilities of multicore machines? As usual in the computer industry, the answer is "real soon now."

* * *

On another note, we had multiple winners in our recent Mars Rescue Mission Challenge (http://www.frank-buss.de/marsrescue/). Please join me in congratulating Kevin Shepherd, Randy Sargent, David Finch, Matthew Ogilvie, Jeremie Allard, Stefan Ram, and Allen Noe. They each received a Dr. Dobb's CD-ROM Release 16 for their efforts. And a special thanks to Frank Buß for conceiving, coordinating, and judging the challenge.