Dr. Dobb's Journal September 2007

Reinventing Computing

By Jonathan Erickson


When you prattle on as much as I do, it's nice to occasionally get some confirmation that you might be on the right track. A case in point: Recall that I've mentioned a time or two the need for programmers to get up to speed on parallel programming for multicore architectures.

As it turns out, I'm not alone in this. And what's gratifying is that someone a lot smarter than me, who knows a lot more about this stuff than I do, seems to agree. Moreover, he says it so much more elegantly than I do.

In a keynote address at the International Supercomputing Conference, Microsoft Technical Fellow Burton Smith, who was chief scientist at Cray before moving to Redmond, talked about new approaches to software development, where everyday computer programs must be able to execute in parallel on multiple microprocessor cores. Smith referred to this as "general-purpose parallel computing" where developers will build more powerful, humanistic software applications that incorporate speech, conversation, rich visualization, and anticipatory execution of tasks. But for this to happen, said Smith, "it's vital that software and hardware adapt to new models of computing."

These new models, he said, would require a new way of thinking, thereby mandating what Smith called a "reinvention of computing" as we know it. Not quite grokking this and curious as to what he meant, I asked. Here's what he had to say:

"Moore's Law will continue to improve transistor cost and speed, but single-processor performance will no longer keep pace. There are two possible future scenarios: Either computers get a lot cheaper but not much faster, or we use parallel computing to sustain continued performance improvement. In the first case, computing becomes a 'mature' industry, and hardware and software become commodities. In the second, consumers will continue to enjoy the benefits of performance improvements, but successful software and hardware providers will have to embrace parallelism to differentiate themselves and compete."

Okay, so what does this mean for software developers? What will be the biggest challenge in moving to this reinvented world?

"Our field must reinvent not only computing but also the computing profession," Smith said. "The universities will do much of the work, but companies like Microsoft will have to help, for example, by educating the developer community about this new way of thinking. We will also make it much easier for developers to write parallel software.

"Whatever the mainstream programming model of the future is, it must enable free interoperability of shared memory and message passing. Shared memory is needed to enable high-bandwidth fine-grain communication and automatic load balancing, but the increasing importance of web-based services demands frequent message-based communication between clients and servers via the Internet.

"If we can't change our standard mode of operation to meet the challenge of manycore computing, the computing world at large will suffer the commoditization scenario I previously mentioned."

And what will this reinvented world look like? "There is a diversity of programming languages in use today; consider the spectrum represented by C++, C#, Visual Basic, SQL, Python, and Matlab. The parallel computing world will be at least as diverse, and will have the same needs for language interoperability and type compatibility that we have today," Smith said.

But my education didn't stop there, as Dr. Smith also pointed me to The Landscape of Parallel Computing Research: A View From Berkeley (view.eecs.berkeley.edu), a most interesting website that focuses on general-purpose parallel computing. And I learned something right away: I've been misstating nomenclature in a hair-splitting kind of way, at least when it comes to emerging parallel architectures. While the terms seem synonymous to me, there's a difference between "multicore" and "manycore." Multicore programming models typically target two to 32 processors, while manycore systems have thousands of processors.

Of course, Burton Smith probably already knew this. Like I said at the outset, he's a lot smarter and knows a lot more about this stuff than I do.

Jonathan Erickson

Editor-in-Chief

jerickson@ddj.com