This month we commemorate that fateful day just 20 years ago when Harvard freshman Bill Gates walked into the historic Aitken Computation Laboratory to take his first college course in programming. Years later, the director of the lab would remember Bill this way:
He was a hell of a good programmer. In terms of being a pain in the ass, he was second in my whole career here. He's an obnoxious human being. ... He'd put people down when it was not necessary, and just generally not be a pleasant fellow to have around the place. (Thomas Cheatham, in Gates, by Stephen Manes and Paul Andrews, Doubleday, 1994.)
On this anniversary of Bill Gates's official entry into programmerhood, it seems appropriate to reflect on what it means to program. And on what kind of person you have to be to be good at it.
I'm going to make two assumptions about you, which I'll confess right here. First, I assume that you are a programmer. Unless your reading habits run to the masochistic, I believe I am on safe ground here. Second, I assume that you are now, or have been at some time, involved in some way in the design or development of a programming tool. This is a riskier assumption, but the odds on it are good. The phenomenon of tool user as tool maker is not at all unusual in the profession of programming.
Well, it may not be unusual in programming, but it does make programming unusual among professions. Most workers do not create their own tools and work environments; carpenters don't make saws, doctors don't make X-ray machines, and bus drivers don't make buses. For that, they are at the mercy of other professions or trades. But programmers do make programming tools.
This distinction is, I suggest, crucially important, because it makes programming uniquely capable of self-definition. Or self-redefinition. The ability to change the tools and environment of programming is the ability to change fundamentally the nature of the enterprise. Programmers can redefine what it means to be a programmer. Programming has, as it were, the power to rewrite its own genetic code.
Surely that is why programming has changed so radically over the 20 years since Bill Gates enrolled at Harvard. Back then, the typical edit-and-test cycle involved wrestling with a keypunch machine; assembling your deck of punched cards in a box, rubber-banding it, and handing it to an operator; and waiting in front of a wall of bins for your deck and printout to come back.
We've come a long way, and we've done it by tugging on our own bootstraps.
Well, you have. By your bootstraps. Technically, my profession is writing. And as a writer, I now inform you that we need a note of dramatic tension here. Here it is: This ability to lift ourselves (okay, yourselves) by the old bootstraps may disappear one day.
The Threat of Specialization

If that happens, the villain would be specialization.
Of course, programming already encompasses specialties. Corporate-database programming and commercial-application development and embedded-systems design all have their own goals and methods and views of the world. These are examples of horizontal specialization, but there is also vertical specialization. The most complex applications today could not be built with the lowest-level programming tools. We are already at the point where higher strata of programming use tools developed at lower strata.
It's not hard to imagine this vertical specialization increasing to the point where computer-science and engineering students would train for a specific stratum of software development, each stratum having its own courses of study and its own paradigms, methods, vocabulary, and tools. In such a scenario, tools would be black boxes supplied by incomprehensible wizards working in what would effectively be a different discipline; moving from one stratum to another would be about as easy and as likely as a physicist going back to school to study biochemistry.
Actually, you might consider this scenario highly desirable; but it does have the drawback of making the various specialties of programming as dependent on other specialties for their tools as lawyers and dentists are today.
Well, it's just a scenario. We're not there yet. We may never get there. I've heard it argued that this is pretty much the goal of object-oriented programming, in which case it's probably not going to happen any time soon. In any case, we still can define our own tools, and we should appreciate this ability while we've got it. We should be ready to challenge our basic assumptions.
It's not easy. Programmers are as susceptible as anyone else to the blinders of the task at hand. The unconscious assumption is that the job is really all about fixing the problems that we've created for ourselves; that the way to the other side of this wall must surely be through the wall.
All professions are susceptible to these blinders, but it matters more in programming because the potential to advance the state of the programming art is so much greater than, say, that of the bricklaying art.
I am writing myself into a corner here. Where I'm heading, obviously, is to a stirring exhortation to question all your assumptions, to look with an innocent eye at what you do and how you do it.
And I do make that exhortation. It's important to question the basic assumptions about what programming is.
Having made that exhortation, though, where do I go from here? After telling an audience to question assumptions and think for themselves, the only sensible thing for the speaker to do is shut up. But I don't think Jon will let me cut off the column here.
The only alternative is to relate an amusing anecdote.
An Amusing Anecdote

You may have heard the story of Steve Dompier and the Altair's first recital. It has been variously related.
It was in the spring of 1975, a year after Bill Gates began his formal academic study of programming. By this time Bill was calling himself the "President of Micro-soft" and was claiming to have an implementation of Basic for the Altair. This Altair was a computer, or so said its manufacturer, a hobby electronics company in Albuquerque, New Mexico, named "MITS." There was a mystery or a miracle surrounding this Altair, because it included an 8080 microprocessor and, according to an article in the January Popular Electronics, sold for roughly Intel's quantity-one price for the 8080: $397. Clearly, these guys must have got a very good deal on the chips.
Steve Dompier was living in Berkeley, California at the time. What he was doing doesn't matter, since the minute he read the Popular Electronics article, he became a man obsessed.
Dompier sent off a check for $397 and waited. He did not wait patiently. When his Altair didn't arrive soon enough to satisfy him, he bought an airplane ticket and flew to Albuquerque to pick it up. The folks at MITS were surprised to see him. He didn't get his computer then, though it arrived in the mail shortly thereafter. Or more correctly, a box of parts arrived. "I received my Altair 8800 in the mail at 10 a.m.," Dompier said, "and 30 hours later it was up and running with only one bug in the memory!"
He then faced a problem: what to do with the thing. For I/O, all it had were toggle switches and blinking lights. There was no software (Micro-soft hadn't delivered yet).
There are people who will buy a tool without knowing what they are going to do with it. Who will fly 2000 miles to pick up a $397 toy. And there are people who won't. Dompier was the first kind.
In April, Dompier showed up at the Peninsula School in Menlo Park, where the Homebrew Computer Club held its meetings. The club hadn't had many meetings: Dompier's was the second Altair that any of them had seen. And none of them had seen anything else you could call a homebrew computer. Turnout for the meeting was terrific. Dompier was going to demonstrate his Altair.
It took some time to set up the machine, and then Dompier started programming it. The audience waited as he flipped toggle switches to enter his program into RAM. All went well, until halfway through the process, someone tripped over the extension cord. The blinking lights went out, the Altair went dead, and Dompier sighed and started over. He had set a portable radio on the table, but didn't think to tune it to a station to entertain the crowd while he flipped switches.
Finally he finished loading and ran the code. Immediately, the radio, sitting next to the Altair, began to buzz with static. The unshielded Altair was putting out so much RFI that the radio was buzzing in time with the loops in Dompier's program. It buzzed Lennon and McCartney's "The Fool on the Hill." According to legend, when it finished that it buzzed "Daisy," also known as "Bicycle Built for Two," as an encore.
The crowd went wild. Dompier had written a set of empty loops whose only purpose was to play music in the static the Altair generated on a portable radio left within RFI range.
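The trick is easy to sketch in modern terms. The constants below are assumptions for illustration only (a 2-MHz 8080 clock and a nominal 20 machine cycles per empty-loop pass), not Dompier's actual numbers; the point is that the iteration count alone determines the pitch of the interference.

```python
# Illustrative sketch, not Dompier's actual program: each "note" is an
# empty busy-loop.  On the unshielded Altair, the loop's repetition rate
# leaked out as radio-frequency interference, so the loop count alone
# set the pitch heard on a nearby AM radio.

CLOCK_HZ = 2_000_000      # assumed 8080 clock rate (2 MHz)
CYCLES_PER_ITER = 20      # assumed cost of one empty-loop pass

def loop_count(freq_hz):
    # Iterations whose total duration equals one period of the tone;
    # repeating the loop at that period "emits" the note.
    return CLOCK_HZ // (CYCLES_PER_ITER * freq_hz)

def play(melody):
    # melody: list of (frequency_hz, periods) pairs.
    for freq_hz, periods in melody:
        for _ in range(loop_count(freq_hz) * periods):
            pass              # the program's only output is its timing

play([(440, 440), (494, 494), (523, 523)])   # A, B, C: three "notes" of nothing
```

On the real Altair the same structure would have been hand-toggled 8080 delay loops; here the `pass` statement stands in for the emptiness.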
All right, you may have heard this story. But you probably haven't heard the moral. Here it is.
When he got his Altair home and put it together, Dompier confronted an odd problem: the absence of a problem. He couldn't think of anything useful for his Altair to do. And he dealt with this problem in the most direct way imaginable: He wrote a program that did nothing. His program was nothing but a set of empty loops, doing quite literally nothing.
It was all side effect.
Now that is an interesting programming paradigm.
Is it a useful one? Ask Alain Colmerauer. He invented a programming tool, the language Prolog, that is all side effect, a language in which you can inform the computer that two plus two equals four but you can't tell it to add.
As I said, an interesting paradigm.
I do not say that Colmerauer knew about the Altair's first recital when he invented Prolog, particularly since he invented it in 1972, three years before Dompier got his Altair.
I do say that the story shows that you can sometimes accomplish things you might have thought were impossible by throwing out some of your most cherished assumptions--such as the assumption that a program must do something. Or do something useful. Science-fiction writer Algis Budrys points out that the purpose of feet is to walk and run and the purpose of painting is to make signs. Ballet and watercolors are obviously useless.
All art springs from the nonobvious use of tools.
The story also gives us one data point regarding what sort of person you have to be to be successful in software: obsessive, imaginative, and impractical. Maybe we should get another data point. And have another anecdote.
The First Programmer

The claim can be made that programmers have been around longer than computers.
Exactly the opposite claim can be made, too, but this claim is the more interesting of the two. I want to make it, and I will defend it, but first I need to tell you about Byron's daughter.
Augusta Ada Byron (1815--1852) was the daughter of a romantic poet and a (woman) mathematician. All of her short life she worried about resolving the poetic and mathematical sides of her nature. Then there were the dark rumors about her father's sexual proclivities, a certain coolness in the way of her mother's love, her frail health. She doesn't seem to have put a lot of effort into being a mother or a wife. She did have a remarkable social circle, including Charles Darwin, Augustus De Morgan, Charles Dickens, Michael Faraday, and Charles Wheatstone.
And, of course, Charles Babbage.
Babbage was designing a computer. With the hindsight of history we know that that is exactly what he was doing and that he had it right. At the time, though, it was hard for him to get across just what it was that he was doing. It was not that his contemporaries failed to grasp the idea of a machine to perform mental operations; calculators and other devices were the intellectual vogue. Babbage's contemporaries, including those close to him, probably thought they got it. That was his problem. A computer is not just a calculator, but that's a hard thing to explain to novices even today. And in Babbage's time, everyone was a novice.
Babbage had got the idea across to a few people; one was an Italian engineer named L.F. Menabrea, who had written a paper explaining the device, publishing it in French in a Swiss journal in 1842. This didn't help Babbage, who was trying to impress potential capital investors, all English.
Another person who got it was Ada. She not only understood it, she translated Menabrea's paper, adding greatly to it in the process.
And she wrote programs for the Analytical Engine. Babbage and Ada agreed that, to get across what the Analytical Engine could do, they had to supply some examples of its operation. Sample programs, we would say today. Ada wrote them: Chiefly, they calculated tables of numbers, as would the first real computers, built nearly a century later.
So the argument that Ada was the first programmer is simple: She was a programmer, and no computer ever existed during her lifetime. Babbage never built his Analytical Engine.
Ada programmed to the spec, of course. She hand-coded on paper. This is nothing; we've all done it. Hand-compiling is often recommended as a good discipline for programmers. But Ada was doing something harder than that.
Put yourself in her place. Ada had never seen a computer. Nobody had ever seen such a thing. She had, however, imagined one. That was the machine she programmed.
What an intellectual challenge! To envision what a computer would be like, and to write programs for it! Of course, that could never be done today; now that computers exist, such a feat of disciplined imagination could never be undertaken. Or could it?
Okay, okay, so what is the moral here? That we can break new ground by unimagining the computer? Or that programming success comes from brilliant, imaginative, obsessive social misfits?
Sigh.
Our two (or, counting Bill, three) data points seem to suggest some things that we might not want to believe regarding the kind of person who becomes a successful programmer. Let's try one more trip to the well of history.
John Backus headed up the team that developed Fortran and is the "B" in "BNF." One of the giants, in other words. According to Backus,
Programming in the 1950s had a vital frontier enthusiasm virtually untainted by either the scholarship or the stuffiness of academia....
Recognition in the small programming fraternity was more likely to be accorded for a colorful personality ... or the ability to hold a lot of liquor well than it was for an intellectual insight. Ideas flowed freely along with the liquor.... An idea was the property of anyone who could use it....
Aha! You can almost see the tattoos, the eye patches, the glint of the gold tooth, can't you? Yes, we need this alternative paradigm: the programmer as antigeek.
Copyright © 1995, Dr. Dobb's Journal