Until very recently, the architects of human-machine interfaces have been unsung heroes (and villains) in a rarely visible and even-more rarely appreciated specialty. For example, if you look inside the case of a classic black desk telephone, you don't find the names of the patient designers and researchers at AT&T who spent years refining the ergonomics of that humble instrument. There was no equivalent of Steve Jobs at Bell Labs to immortalize these fellows! Similarly, most of the older literature on human-machine interfaces lies far off the beaten path in obscure journals, conference proceedings, and graduate student theses.
But the rapid proliferation of PCs and their "productivity" applications over the last decade has focused public attention on interface issues that were previously only dimly perceived, because the interaction between a user and a computer program is so much more intimate and intense than the interface between a user and a toaster, a telephone, or even a typewriter. You can put up with a toaster whose darkness control is difficult to adjust, or a telephone that is shaped like Mickey Mouse, but it's frustrating and aggravating to work with a computer program whose interface is inconsistent, inefficient, or unforgiving. When you're forced to use a "bad" program, you take it personally.
Unfortunately, the track record of the PC software industry with regard to human interfaces has not been good. Experienced interface designers are often brought into the software development process late, or are not consulted at all. As Ted Nelson of Xanadu fame has commented:
Historical accident has kept programmers in control of a field in which most of them have no aptitude for the artistic integration of the mechanisms they work with. It is nice that engineers and programmers and software executives have found a new form of creativity with which to find a sense of personal fulfillment. It is just unfortunate that they have to inflict the result on users.
The Macintosh stands out as the one personal computer where human interface designers, psychologists, and graphic artists were involved with the hardware and software from its earliest stages. The elegance of the Macintosh System and Finder and the powerful, yet consistent, user interface found in Macintosh applications are not a happy accident. They have resulted from a determined effort by Apple to establish, evolve, and evangelize the user-interface guidelines that were created with the help of these consultants, and to stigmatize applications that flout the guidelines as "un-Mac-like."
DOS and UNIX software vendors, on the other hand, have only awakened to esthetic considerations rather recently. Up until a year or two ago, the "graphical user interface" of a Sun workstation was little more than a clock icon and a command-line interpreter running in a movable window. Similarly, the graphical user interface of Microsoft Windows, Versions 1.03 and 2.x, was notorious for its clumsiness, ugliness, and the counterintuitive assignments of its so-called "accelerator" keys. (Microsoft publicized its enlistment of graphic designers for Windows, Version 3, but this mainly called attention to its failure to consult such experts much earlier.)
Addison-Wesley's recently released book The Art of Human-Computer Interface Design started out as an internal Apple project to organize and document the company's experience for the benefit of new employees. As the project became more ambitious, it was transformed into a book proposal, and a call for papers was sent out. Brenda Laurel, the editor for the project, collected proposals and commissioned articles over a two-year period. The first drafts of the articles were distributed to all of the authors, the authors were gathered together for a three-day conference at Asilomar to share ideas, and then the papers were revised a second and third time to build in cross-references and add discussions of recent developments.
The result of this cross-pollination is a massive opus divided rather arbitrarily into five sections: Creativity and Design, Users and Contexts, Sermons, Technique and Technology, and New Directions. The 50-odd chapters are a hodgepodge of essays, interviews, anecdotes, musings, and polemics -- ranging from insightful (Alan Kay and Scott Kim) to vague and pretentious (Jean-Louis Gassee) to virtually incomprehensible (Timothy Leary). The most engaging material is often found where you least expect it, such as the description of a touch-screen/Mac II/voice synthesis interface constructed by Apple researchers for Koko, a 260-pound gorilla. (The computer's case is built out of one-half-inch polycarbonate, one-inch by two-inch aluminum girders, and one-inch tempered glass, with slots for passive ventilation that channel foreign materials such as bananas and excrement away from the CPU.)
Although The Art of Human-Computer Interface Design is enjoyable reading and is well worth your time, the quality of the writing is somewhat uneven and the focus is erratic. Furthermore, the book is not cohesive or detailed enough to serve as a primary textbook or as a design guide for professionals. The emphasis on the Apple graphical user interface also limits the book's usefulness -- I'm happy to stipulate that the Apple System 7 is the best GUI on a mass-market computer today, but there are certainly worthwhile innovations in NextStep, Motif, and even (heaven help us) OS/2 Presentation Manager that could have been discussed, not to mention the stylus-oriented platforms such as PenPoint that are looming on the horizon.
The most disappointing aspect of this book, for me, is its conservatism. Within a very few years, near-microscopic self-contained multi-MIPS 32-bit (or 64-bit?) processors will be manufactured for pennies, wireless networking will be the norm, and real-time access to information services will be available at any point on the earth's surface, courtesy of satellites and digital cellular telephony. The implications are literally mind-boggling. But most of the authors of The Art of Human-Computer Interface Design don't see farther ahead than two-handed Mac manipulations (a trackball in the left hand and a mouse in the right), semi-smart voice-mail, and head-mounted displays combined with solenoid-encrusted gloves for cartoon-like virtual realities.
Where might we find more adventurous thinking on human-computer interfaces? The literary genre known as science fiction seems like a reasonable candidate. Over the years, science fiction authors have "predicted" a host of technological advances, from geosynchronous communication satellites to computer viruses. But a preliminary glance into the science fiction classics isn't likely to impress you. Almost without exception, the "Old Masters" were unable to project computer technology in any direction other than bigger and better mainframes, some of which became sentient merely as a function of their size rather than as the result of any advances in hardware or software. For illustrations of this imaginative deficit, see 2001 (Arthur C. Clarke), the Foundation series (Isaac Asimov), Cities in Flight (James Blish), Colossus (D.F. Jones), and The Moon Is a Harsh Mistress (Robert A. Heinlein).
However, there is an emerging group of science fiction authors who have grown up with computer technology and can conceive of more provocative outcomes for the human-computer interface. The novels written by these authors are collectively known as "cyberpunk" and are characterized by a vivid and sometimes disorienting "in-your-face" style, as well as an aggressive extrapolation and incorporation of cutting-edge technology. The cyberpunk authors also seem to share a fatalistic outlook that the world will inexorably become more noisy, dirty, stressful, crowded, and corrupt -- but perhaps this is just the angst of the "X Generation" rather than a property of cyberpunk per se.
For those of you who are interested in having your consciousness raised, broadened, or perhaps even assaulted a little, I'm going to make a few cyberpunk reading recommendations. The first, last, and most important book to buy is Neuromancer, by William Gibson. The characters in Neuromancer inhabit a frantic, depersonalized world where all true political power has devolved to multinational cartels descended from the Japanese zaibatsus; where prosthetic organs and mind-altering drugs are as readily available as soft drinks (if you have the money); and where the age-old mortal sins have been supplanted by new crimes and vices related to the theft or subversion of other people's data. Gibson crystallized the concept of "cyberspace"--the hustlers of his world "jack in" to "cyberspace decks" with which they can perceive the world-girdling networks directly as a surreal terrain called "the matrix."
"The matrix has its roots in primitive arcade games," said the voice-over, "in early graphics programs and military experimentation with cranial jacks." On the Sony, a two-dimensional space war faded behind a forest of mathematically generated ferns, demonstrating the spatial possibilities of logarithmic spirals; cold blue military footage burned through, lab animals wired into test systems, helmets feeding into fire control circuits of tanks and war planes. "Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts... A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding..."
He settled the black terry sweatband across his forehead, careful not to disturb the flat Sendai electrodes. He stared at the deck in his lap, not really seeing it, seeing instead the shop window on Ninsei, the chromed shuriken burning with reflected neon... He closed his eyes. Found the ridged face of the power stud. And in the blood-lit dark behind his eyes, silver phosphenes boiling in from the edge of space, hypnagogic images jerking past like film compiled from random frames. Symbols, figures, faces, a blurred fragmented mandala of visual information... A gray disk, the color of Chiba sky. Disk beginning to rotate, faster, becoming a sphere of paler gray. Expanding--
And flowed, flowered for him, fluid neon origami trick, the unfolding of his distanceless home, his country, transparent 3D chessboard to infinity. Inner eye opening to the stepped scarlet pyramid of the Eastern Seaboard Fission Authority burning beyond the green cubes of Mitsubishi Bank of America, and high and very far away he saw the spiral arms of military systems, forever beyond his reach.
Bruce Sterling's Islands in the Net is, on the surface, a vision of a kinder, gentler future than Gibson's. His characters jog on the beach and are ecologically oriented; the nuclear family, in some form or other, is still important; and a New-Age philosophy of the Optimal Persona and a low-energy, Gaia-friendly school of architecture prevail. All this, however, plays out against a backdrop of political fragmentation, high-tech terrorism, tiny third-world nations that survive by harboring data pirates and skimming a percentage of their profits, and near-total depletion of the world's natural resources. And again, the whole world's facilities for information storage, business transactions, education, and even recreation have been subsumed into an all-knowing, all-embracing, all-pervading network.
Islands in the Net is one of the most absorbing science fiction books I've ever read, but I also found it to be one of the most disturbing. This is, I guess, because the near-future world it paints feels frighteningly plausible; everything is a reasonable extrapolation of existing trends, with no appeal to the invention of magical new technologies. In contrast to Sterling, who manipulates global sociopolitical issues with ease, Pat Cadigan turns the reader inward and explores the layers of consciousness. Her excellent but little-known book Mindplayers portrays a society where one of the forbidden forms of amusement is the induction of artificial psychoses with a direct computer-to-mind interface called a "madcap." Those who get caught by the Brain Police are required to undergo corrective therapy.
As the final item in this little cyberpunk sampler, I offer Vernor Vinge's Marooned in Realtime. Although this book is set in the same future world as Vinge's earlier book, The Peace War, it addresses a far more cosmic question--the next evolutionary step for the human race--embedded in a crackling detective yarn. The narrator, an ex-cop, is a sort of Rip Van Winkle of the space age; he and a handful of others emerge from stasis bubbles to find the residues of an unimaginably high-tech infrastructure on an empty world. With the physical evidence blurred past recognition by the passage of millions of years, the question of the age is: Where did everyone else go? Some interpret the event as an epidemic, others as a war between nations or with invading aliens. But one of the other survivors speculates:
"We were on the exponential track... By 2200, all but the blind could see that something fantastic lay in our immediate future. We had practical immortality. We had the beginnings of interstellar travel. We had networks that effectively increased human intelligence--with bigger increases coming... And intelligence is the basis of all progress. My guess is that by mid-century, any goal--any goal you could state objectively without internal contradictions--could be achieved. And what would life be like fifty years after that? There would still be goals, and there would still be striving, but not what we could understand.
"To call that time 'The Extinction' is absurd. It was a Singularity, a place where extrapolation breaks down and new models must be applied. And those new models are beyond our intelligence... Mankind simply graduated, and you and I and the others missed graduation night."
Vinge's anticipation of a Singularity brings me around to the nonfiction book I was thinking of when I began writing this column: Disappearing Through the Skylight, by O.B. Hardison, Jr. This book discusses how the accelerating pace of developments in computers, quantum physics, cosmology, chaos theory, molecular biology, and numerous other disciplines has already thrust us through what amounts to a Singularity. Our understanding of the universe and our place in it has changed in fundamental ways, and the reverberations of this change find their expression in such media as concrete poetry, neomodernist architecture, dissonant music, and minimalist styles of painting.
Computers now share the human environment. Most obviously they exhibit rudimentary intelligence. They also have been equipped with arms and grippers and legs, and in this form they have begun to act physically on the world around them and modify it. Inevitably, they affect the sense of human identity. Is the mind a machine--and a relatively simple one at that, once the trick of programming with neurons is understood? Is the claim of humanity to uniqueness disappearing along with the claim of each human to a separate identity shaped by a local habitation and a name? Is the idea of what it is to be human disappearing, along with so many other ideas, through the modern skylight?
In its fearless exploration of inner and outer worlds, modern culture has evidently reached a turning point--a kind of phase transition from one set of values to another. Crossing the barrier that separates the phases is another kind of disappearance. The nature of that barrier is nicely characterized in a phrase developed by science in connection with the search for extraterrestrial life: "horizon of invisibility." A horizon of invisibility cuts across the geography of modern culture. Those who have passed through it cannot put their experience into familiar words and images because the languages they have inherited are inadequate to the new worlds they inhabit. They therefore express themselves in metaphors, paradoxes, contradictions, and abstractions rather than languages that "mean" in the traditional way--in assertions that are apparently incoherent or collages using fragments of the old to create enigmatic symbols of the new. The most obvious case in point is modern physics, which confronts so many paradoxes that physicists like Paul Dirac and Werner Heisenberg have concluded that traditional languages are, for better or worse, simply unable to represent the world that science has forced on them. In "Quantum Mechanics and a Talk with Einstein," Heisenberg remarks, "I assume that the mathematical scheme works, but no link with the traditional language has been established so far." The same comment might be made about the relation between the twentieth-century languages of Cubism, collage, Dada, and concrete poetry and the visual and verbal languages that preceded them.
Disappearing Through the Skylight is so wide-ranging that it defies summarization here. Hardison explains and critiques everything from modern poetry to experiments in artificial reality with an insight and authority that most of us would be delighted to be able to apply to a single discipline. The book's theme is not computers, but computers are found in every part of the book, because computers are rapidly becoming an integral part of our culture (and hence, "disappearing"). Hardison's analysis of the current research into artificial intelligence is fascinating, and his speculations on the evolution of silicon life are startling. Buy this book.
Copyright © 1991, Dr. Dobb's Journal