
  Adding to the challenge, the hardware comprised an unwieldy mix of ancient and modern components. The processors were built using a brand-new, and far from bug-free, technology known as TTL (for "transistor-transistor logic"). But the read-only memory, which carried the machine's basic operating code, was an array of diodes soldered onto some 20 circuit boards—hundreds of thousands of tiny diodes, each representing a digital bit. Editing the operating code during the debugging phase, Simonyi recalled, meant finding the errant diode on a foot-square circuit board, clipping it by hand with a wire cutter, then drilling new holes and soldering a fresh diode in place. It was like editing text with a hammer and chisel.

  More problems cropped up in designing the operating system. In computing, economies of scale often work in reverse: As a system grows larger it becomes exponentially more complicated. A computer designed to serve fifty users is not ten times more complicated than a machine for five users, but one hundred times, or a thousand. Berkeley's 500-user machine proved more complex than even this exquisitely talented team could handle. As Frederick Brooks would have predicted, when they finally got it working it did not work perfectly—and never well enough for the rated capacity of 500 users.

  The full-bore crash of expectations, however, came late in the company's life cycle. For the first year or two BCC's fortunes resembled the rising curve of an arc. The company ramped up employment to more than 100 hardware and software engineers. The prototype's name got upgraded from the Berkeley 1 to the Berkeley 500, the better to express their confidence in its capacity (and, as Lampson said, "for marketing purposes").

  Pirtle had an artful way of squelching any doubts that might arise about the program. One recruit, an engineer named Ed Fiala who came from the Boston engineering firm of Bolt, Beranek & Newman, had taken the precaution of ordering a Dun & Bradstreet credit report on BCC before deciding whether to accept the job. "It said, 'Well, they're paying their bills but we don't exactly know how,'" Fiala recalled. He stifled his misgivings and moved to Berkeley anyway, arriving on a night when BCC was having a big party.

  "Everybody was all smiles and enthusiasm, and I asked Mel Pirtle how the company was doing," Fiala recalled. "He said, 'We're doing great!' I was a little concerned. I asked, What sort of financing do you have?' He said, 'Oh, we have lots. We just got three hundred thousand dollars.' And I asked how long that would last and he said, 'Six weeks.'

  "Now, three hundred thousand dollars sounded like a lot of money to me but the six weeks didn't seem all that long. So I said, 'Isn't that a lot less time than it's going to take you to complete your project?' He said, 'Well, yes . . . but we can always get more money.'"

  For a while that was true. Backers touring the BCC quarters invariably came away convinced by the staff’s high spirits that everything was on track. ("I know I tried to look enthusiastic," Fiala said.) But in 1970 the well ran dry. Not only had the recession dug in, but the technological risks confronted so cavalierly by BCC and other time-sharing companies turned out to be tougher than anyone expected. There were new questions about whether the machine would ever get done, and who would buy it if it did. Pirtle started to pare staff. Early in the year he placed those who were left on half salary and advised everyone to look for work.

  Around then Bob Taylor made his first appearance on the premises. To the younger programmers he cut quite the intriguing figure, a natty, self-assured individual to whom Pirtle and Lampson seemed to pay an unusual degree of respect—even deference. "I didn't know what to make of him," said Simonyi, still a cultural innocent living hand-to-mouth as a Berkeley undergraduate while working part-time at BCC. "I had this impression of a laid-back Playboy type, a Hugh Hefner type, good looking, good dresser, athletic, with the pipe of course, always with the pipe. Taylor never had the airs of the false technical B.S. artist. It was a plus that he didn't, but at the same time I had to ask myself: 'If he's not technical, not a technical B.S.'er, then what the hell is he?'"

  The answer was: A man energetically pursuing a deal. Having alerted Pake to BCC's financial problems and the likelihood of a rare cache of first-rank talent hitting the market, his initial idea was for Xerox to buy BCC outright and fold it whole into PARC, like a fresh egg into raw batter. That way PARC would acquire an advanced time-sharing prototype along with at least twenty top people and a sizable complement of junior staff. Negotiations with Pirtle along those lines proceeded into the fall.

  But whether because Pirtle quoted too high a price or BCC failed too rapidly, the wholesale deal collapsed in favor of a sort of à la carte arrangement. Just as BCC filed for bankruptcy (one final drunken party on Friday, November 13, 1970, drained the last of its petty cash), PARC hired six of its best people—Lampson, Thacker, Deutsch, Fiala, a hardware designer named Richard Shoup, and a software programmer named Jim Mitchell. Pirtle was not interested in coming along. Instead he took over the management of a colossal government project to build the world's first parallel-architecture supercomputer, the Illiac IV, at NASA's Ames Research Center in neighboring Mountain View (although some thought that the real reason was that he knew he would never get along with Taylor). Simonyi went with him, for the moment.

  The Berkeley 500, the only machine of its kind ever built, was purchased by ARPA on Taylor’s recommendation and shipped to the University of Hawaii to allow that institution to join the ARPANET. The last employees of Berkeley Computer got it up and running in the cavernous building on Sixth Street, then watched with bittersweet emotions as it got crated up. "It was a very complex and interesting project," Fiala mused later. "It would have been fun to work on it for five years."

  Virtually at a single stroke, Taylor had completed his team—almost. He had the best hardware man (Thacker), the best designer of operating systems (Lampson), and an entire cell of other computer science prodigies. PARC was missing only one thing: a philosopher.

  For there was still the obstacle that among the leading computer experts in the country, including those now on his payroll, very few agreed with him that the goal of computer design was to create a personal machine that interacted with the user via a high-powered display. The BCC group, deeply rooted in the culture of time-sharing, was still intent on getting as many users hooked into a single machine as technologically possible. That goal, as Wes Clark contended, remained incompatible with giving the individual the kind of speed and responsiveness that interactivity required. When Taylor tried to explicate his notion of a display-based user interface, they tuned him out. They would not come around to his point of view for nearly two years.

  But one man was way ahead of them all. That one had written a doctoral thesis at Utah in 1969 describing an idealized interactive computer called the FLEX machine. He had experimented with powerful displays and with computers networked in intricate configurations. On page after page of his dissertation he lamented the inability of the world's existing hardware to realize his dream of an interactive personal computer. He set before science the challenge to build the machine he imagined, one with "enough power to outrace your senses of sight and hearing, enough capacity to store thousands of pages, poems, letters, recipes, records, drawings, animations, musical scores, and anything else you would like to remember and change."

  To Taylor he was a soulmate and a profound thinker, capable of seeing a computing future far beyond anything even he could imagine. Among the computer scientists familiar with his ideas, half thought he was a crackpot and the other half a visionary. His name was Alan Kay.

  CHAPTER 4

  Not Your Normal Person"

  The hallmark mop of shaggy black hair is shot with gray, but as he nears a quite implausible sixty years of age, little else has changed. Certainly not the energy level, or the sneakers, so characteristic of his working uniform, or the unceasing effulgence from his mind of historical observation, moral instruction, and technological vision.

  "Conversations with Alan Kay aren't about any particular thing," says Carver Mead, a Ca
ltech professor who developed the technology of complex integrated circuits at PARC. "They're more a ramble through Ideaspace."

  Ideaspace Central today is divided between two Southern California locations about ten miles apart. One is Kay's home in an affluent part of Los Angeles. It is unassuming from the outside except for a towering V-roofed addition. This curious annex was custom-built to shelter a two-story pipe organ professionally hand-crafted of exquisite blond spruce, on which Kay can be heard almost any morning practicing his favorite music by Buxtehude and J. S. Bach. ("Alan believed his role was to make it possible to build the organ, after which he would be the happy caretaker," remarked its architect, Greg Harrold.)

  The second location is a warehouse-like building in Glendale, a smoggy precinct of the San Gabriel Valley just north of L.A. Artfully arranged partitions and bookcases provide Kay with a spacious work area open to the floor through a doorless passage on one side—not too private, for he likes to spend the workday in constant stir, eliciting and dispensing ideas among his co-workers with equal generosity. He greets you wearing an oval name tag reading "Alan" and bearing a picture of Mickey Mouse. It should look ludicrous and it does, until you remember that this is the man whose playful digitized image of Cookie Monster launched the age of the personal computer. Or that he is now employed—as are two other members of the extraordinary team he assembled at PARC—by the Walt Disney Company, which has entrusted him with helping to develop new ways to transmit story and idea from creator to audience.

  Alan Kay might have been the role model for the modern computer nerd, a Chuck Yeager for the generation that got engaged by the new technology in the 1970s. If you lived within that era's insular community of students and electronics nuts you knew his name, perhaps because you had read his lucid explications of microelectronics and software in Scientific American, or read an article featuring him in (of all places) Rolling Stone. You had been socially conditioned to feel ungainly and isolated by your devotion to machines and math; Alan Kay positively reveled in it, swaggered with it, declared in the pages of the counterculture bible itself that you and your awkward pals in all your nebbishy glory were the prophets of a new world in which computers and their unparalleled power would belong to the masses.

  The Computer Bum, as he enlightened Rolling Stone's readers, was someone who looked "about as straight as you'd expect hot-rodders to look. It's that kind of fanaticism. He's a person who loves to stay up all night, he and the machine in a love-hate relationship." The hacker as rebel: Not an undernourished weirdo, merely someone "not very interested in conventional goals."

  "Alan had been thrown out of every university in the country," recalled John Warnock, a mathematician who knew him first as a fellow graduate student at the University of Utah and later as a colleague at PARC. "He's not your normal person. He's a child prodigy who doesn't quite fit in with your normal academics."

  His wife, Bonnie MacBird, would transfer his personality to a character (distribute it among several, actually) in her original screenplay for the first computer-animated high-tech thriller, a Disney film entitled Tron, in which his boldness, his confidence, his exhilarating kinesthesia somehow survived the merciless dilution of Hollywood script doctoring. Alan Kay today is still the kind of person who communicates an impression of pure motion even when he is sitting down. As Carver Mead suggests, a conversation with him is an exhausting scaled-up affair. Once you get him talking he performs what he calls a "brain dump" on you, years of accumulated knowledge and synthesis pouring forth in a flood of narrative in which the protagonists are Alan Kay and the startling and visionary ideas he holds dear (many of them still deplorably unrealized), and their adversaries are managers, executives, bean-counters, corporate boards, schoolteachers, and all others who regard the unshackled imagination as a menace rather than a gift.

  Visible within the flood of ideas is the Alan Kay who made computing cool. He declared publicly that it was all right to use three-million-dollar machines to play games and "screw around." If that meant grad students were blasting digital rocket ships off their computer screens in a game called "Spacewar," it was all part of the weaving of new technology into the cultural fabric. His unashamed view of the computer as very much a toy liberated many others to explore its genius for procedures other than the parsing of numbers and the sequencing of databases—to see it, in other words, as a creative tool.

  This notion of technology as a means to an end still distinguishes Kay from most other practitioners of the art and science of technology. One factor in his powerful kinship with Bob Taylor was their shared curiosity about what this machine could be made to do, more than how. Notwithstanding his incessant harangues, most of the inspired engineers Taylor recruited to CSL, the Lampsons and Thackers, started out too blindly focused on the issue of what was within their power to actually build. They would ask: What is the next stop on the road? Kay turned the question inside out: Let's figure out where we want to go, and that will show us how to get there. He never lost sight of the computer's appropriate station in the world: to conform to the user's desires, not the other way around.

  "It's almost impossible for most people to see technology as the tool rather than the end," he was saying one day in his cubicle at Disney Imagineering's Glendale warehouse. He was about to embark on another excursion through what Carver Mead called Ideaspace, where hyperbole and metaphor are equivalent coins of the realm (or obverse sides of the same coin). "People get trapped in thinking that anything in the environ­ment is to be taken as a given. It's part of the way our nervous system works. But it's dangerous to take it as a given because then it controls you, rather than the other way around. That's McLuhan's insight, one of the bigger ones in the twentieth century. Zen in the twentieth century is about taking things that have been rendered invisible by this process and trying to make them visible again.

  "Parents ask me what they should do to help their kids with science. I say, on a walk always take a magnifying glass along. Be a miniature exploratorium. . . ."

  You would have to know something about his life to recognize this as a scene from his childhood. Kay's father was a scientist, a physiologist engaged in designing prostheses for arms and legs. "I can certainly remember going on walks with him," Kay recalled. "And when you go on a walk with a parent who's interested in science, it's usually about all the things that you can't readily see."

  This sort of unleashed curiosity would allow him to recognize new ways of placing computing power in everyone's hands. But he had to travel a fair distance before discovering that his destiny lay in the arcane science of systems programming. That might never have happened at all had circumstances not left him becalmed on an Air Force base in Waco, Texas, in the suspended state of existence known as "Figmo."

  The term is a military acronym for "Fuck it, got my orders." As always with service slang, one can hardly think of a better way to describe the condition. It was 1961 and Kay was marking off the last two years of his enlistment. At this moment he was working in the pathology lab at James Connally AFB, on the verge of being transferred on.

  "I was in figmo, when you're at your old base but everybody knows you're about to go somewhere else. You're not for real anymore on this old base. You sit around and play cards and read books, one of the best things in the military. I was trying to get a little better at poker with a figmo who was a professional poker player, the trick being to see if I could make it a learning experience instead of just getting fleeced."

  But if Kay could not be ordered to do a damn thing pending his transfer, his state of enforced idleness left him wide open to being enticed. In this case the enticement was the scheduling of an aptitude test for computer programmers. No one from Connally had ever passed this test. To a prodigy, however, any standardized test is like a carnival midway. Kay, whose mind was as nimble as it was underemployed, viewed it as a lark. "No way I'd ever pass up a test," he said. Naturally, he passed handily.

  As luck would have it, the Air Force did not view programmer training quite so casually. Undergoing a full-scale conversion from primitive punch card tabulators to the IBM 1401, the world's first popular general business computer, the service was pulling linguists out of Europe to turn them into programmers and scouring the ranks for anyone showing the slightest ability.

  "They figured that since you'd taken this test, IBM could teach you to program the 1401 from scratch in one week," Kay recalled. "It wasn't computer science, just training, but it was the best training I've ever had. You worked your ass off and at the end of the week you could program a computer."

  On the surface Kay seemed an unlikely candidate to take to the rigors of instructing a machine how to operate, for by habit he responded poorly to rules and regulations not his own. The explanation, however, lies in how a computer's stern and unyielding logical rules can lead to infinitely creative results.

  Computers look smart, but their intelligence is a fraud, a sleight-of-hand stunt abetted by blinding speed and a capacity for infinite reiteration. They must be instructed how to perform every tiny step of a problem of ratiocination, and in what sequence. That is why nothing that ever happens inside a computer is entirely unexpected (unless it is going wrong). The machine has been shown the way by its programmer, like a child taken for a stroll along a garden path. Both partners know the rules of the journey. Let the programmer stray one step off the path—let's say by coding a command that violates the machine's logic—and the computer will refuse to follow. Let the computer break the rules—refusing to take the next step along the mandated path—and the programmer will know it is sick and must be cured before they can take even one more stride together.
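
  The point is easy to see in miniature. The short sketch below is purely illustrative, written in modern Python for convenience; it has nothing to do with the IBM 1401 or with anything Kay actually wrote. It spells out every step of a trivial calculation, then shows what happens when a single step strays off the path:

    # Illustrative sketch only: the machine does exactly what each step says,
    # and refuses to continue the moment an instruction violates its logic.
    def average(values):
        total = 0
        for v in values:             # every tiny step must be spelled out...
            total = total + v        # ...add one value at a time
        return total / len(values)   # divide by the count of values

    print(average([3, 5, 10]))  # programmer and machine stay on the path: prints 6.0
    print(average([]))          # one step off the path: division by zero,
                                # and the machine halts with ZeroDivisionError

  Run as written, the first call ambles happily to 6.0; the second stops cold, the machine's refusal to take a step its logic forbids.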