



  IDEA MAN

  A MEMOIR BY THE CO-FOUNDER OF MICROSOFT

  PAUL ALLEN

  PORTFOLIO PENGUIN

  Published by the Penguin Group

  Penguin Books Ltd, 80 Strand, London WC2R 0RL, England

  Penguin Group (USA) Inc., 375 Hudson Street, New York, New York 10014, USA

  Penguin Group (Canada), 90 Eglinton Avenue East, Suite 700, Toronto, Ontario, Canada M4P 2Y3

  (a division of Pearson Penguin Canada Inc.)

  Penguin Ireland, 25 St Stephen’s Green, Dublin 2, Ireland

  (a division of Penguin Books Ltd)

  Penguin Group (Australia), 250 Camberwell Road, Camberwell, Victoria 3124, Australia (a division of Pearson Australia Group Pty)

  Penguin Books India Pvt Ltd, 11 Community Centre, Panchsheel Park, New Delhi – 110 017, India

  Penguin Group (NZ), 67 Apollo Drive, Rosedale, Auckland 0632, New Zealand

  (a division of Pearson New Zealand Ltd)

  Penguin Books (South Africa) (Pty) Ltd, 24 Sturdee Avenue, Rosebank, Johannesburg 2196, South Africa

  Penguin Books Ltd, Registered Offices: 80 Strand, London WC2R 0RL, England

  www.penguin.com

  First published in the United States of America by Portfolio/Penguin, a member of Penguin Group (USA) Inc. 2011

  First published in Great Britain by Portfolio Penguin 2011

  Copyright © MIE Services LLC, 2011

  All rights reserved

  The moral right of the author has been asserted

  Grateful acknowledgement is made for permission to reprint an excerpt from ‘Purple Haze’, written by Jimi Hendrix, published by Experience Hendrix, L.L.C. Used by permission. All rights reserved.

  Except in the United States of America, this book is sold subject to the condition that it shall not, by way of trade or otherwise, be lent, re-sold, hired out, or otherwise circulated without the publisher’s prior consent in any form of binding or cover other than that in which it is published and without a similar condition including this condition being imposed on the subsequent purchaser

  ISBN: 978-0-14-196938-1

  FOR MY PARENTS

  CONTENTS

  1 Opportunity

  2 Roots

  3 Lakeside

  4 Acolytes

  5 Wazzu

  6 2+2=4!

  7 MITS

  8 Partners

  9 SoftCard

  10 Project Chess

  11 Borrowed Time

  12 Wake-Up Call

  13 Hellhounds

  14 Blazermania

  15 12th Man

  16 Space

  17 Jimi

  18 Wired World

  19 Fat Pipe

  20 Searching

  21 Mapping the Brain

  22 Adventure

  In Sum

  Acknowledgments

  Appendix

  CHAPTER 1

  OPPORTUNITY

  As I walked toward Harvard Square on a December weekend afternoon in 1974, I had no inkling that my life was about to change. The weather was snowy and cold, and I was twenty-one years old and at loose ends. My girlfriend had left a few weeks earlier to return to our hometown of Seattle three thousand miles away. I was three semesters shy of graduation at Washington State University, where I’d taken two breaks in the last two years. I had a dead-end job at Honeywell, a crummy apartment, and a ’64 Chrysler New Yorker that was burning oil. Unless something came along by summer, I’d be going back myself to finish my degree.

  The one constant in my life those days was a Harvard undergraduate named Bill Gates, my partner in crime since we’d met at Lakeside School when he was in eighth grade and I was in tenth. Bill and I learned how to dissect computer code together. We’d started one failed business and worked side by side on professional programming jobs while still in our teens. It was Bill who had coaxed me to move to Massachusetts with a plan to quit school and join him at a tech firm. Then he reversed field to return to college. Like me, he seemed restless and ready to try something new.

  Bill and I kept casting about for a commercial project. We figured that we’d eventually write some software, an area where we knew we had some talent. Over grinders or a pepperoni pie at the Harvard House of Pizza, we fantasized about our entrepreneurial future. One time I asked Bill, “If everything went right, how big do you think our company could be?”

  He said, “I think we could get it up to thirty-five programmers.” That sounded really ambitious to me.

  Since the dawn of integrated-circuit technology in the 1950s, forward thinkers had envisioned ever more powerful and economical computers. In 1965, in a journal called Electronics, a young research physicist named Gordon Moore made that prediction specific. He asserted that the maximum number of transistors in an integrated circuit would double each year without raising the chip’s cost. After cofounding Intel in 1968, Moore amended the rate of doubling to once every two years—still dramatic. Similar trends soon emerged in computer processing speed and disk storage capacity. It was a simple but profound observation that holds true to this day. Because of continual advances in chip technology, computers will keep getting markedly faster and cheaper.

  The momentum of Moore’s law became more evident in 1969, a few months after I’d met Bill. (I was sixteen then, just learning to program on a mainframe computer.) A Japanese company called Busicom asked Intel to design chips for a cheap handheld calculator that could undercut the competition. Busicom assumed that the new machine would require twelve integrated-circuit chips. But Ted Hoff, one of Intel’s electrical engineers, had a bold idea: to shave costs by consolidating the components of a fully functioning computer onto a single chip, what came to be called a microprocessor.

  Before these new chips arrived on the scene, it took dozens or hundreds of integrated circuits to perform one narrow function, from traffic lights to gas pumps to printer terminals. Microwave-oven-size minicomputers, the machines that bridged mainframes and the microcomputers yet to come, followed the same formula: one chip, one purpose. But Hoff’s invention was far more versatile. As Gordon Moore noted, “Now we can make a single chip and sell it for several thousand different applications.” In November 1971, Moore and Robert Noyce, the co-inventor of the integrated circuit, introduced the Intel 4004 microchip at a price of $200. The launch advertisement in Electronic News proclaimed “a new era of integrated electronics.”

  Few people took notice of the 4004 early on, but I was a college freshman that year and had time to read every magazine and journal around. It was a fertile period for computers, with new models coming out almost monthly. When I first came across the 4004, I reacted like an engineer: What cool things could you do with this?

  At first glance, Intel’s new chip looked like the core of a really nice calculator. But as I read on, I could see that it had all the digital circuitry of a true central processing unit, or CPU, the brains of any computing machine. The 4004 was no toy. Unlike application-specific integrated circuits, it could execute a program from external memory. Within the limits of its architecture, the world’s first microprocessor was more or less a computer on a chip, just as the ads said. It was the first harbinger of the day when computers would be affordable for everyone.

  Four months later, as I continued to “follow the chips,” I came across the inevitable next step. In March 1972, Electronics announced the Intel 8008. Its 8-bit architecture could handle far more complex problems than the 4004, and it addressed up to sixteen thousand (16K) bytes of memory, enough for a fair-size program. The business world saw the 8008 as a low-budget controller for stoplights or conveyor belts. (In that vein, Bill and I would later use it in our fledgling enterprise in traffic flow analysis.) But I knew that this second-generation microchip could do much more, given the chance.

  My really big ideas have all begun with a stage-setting development—in this case, the evolution of Intel’s early microprocessor chips. Then I ask a few basic questions: Where is the leading edge of discovery headed? What should exist but doesn’t yet? How can I create something to help meet the need, and who might be enlisted to join the crusade?

  Whenever I’ve had a moment of insight, it has come from combining two or more elements to galvanize a new technology and bring breakthrough applications to a potentially vast audience. A few months after the 8008 was announced, one of those brain waves came to me. What if a microprocessor could run a high-level language, the essential tool for programming a general-purpose computer?

  It was plain to me from the outset that we’d use BASIC (Beginner’s All-Purpose Symbolic Instruction Code), the relatively simple language that Bill and I learned back at Lakeside in our first computer experience. The latest minicomputer from Digital Equipment Corporation, the PDP-11, already ran the more complex FORTRAN on as little as 16K of memory. While an 8008 machine would be quite a bit slower, I thought it should be able to perform most of the same functions at a fraction of the PDP-11’s cost. Ordinary people would be able to buy computers for their offices, even their homes, for the very first time. An 8008 BASIC could swing open the gate to an array of applications for a limitless clientele.

  And so I asked Bill, “Why don’t we do a BASIC for the 8008?”

  He looked at me quizzically and said, “Because it would be dog-slow and pathetic. And BASIC by itself would take up almost all the memory. There’s just not enough horsepower—it would be a waste of time.” After a moment’s reflection, I knew he was probably right. Then he said, “When they come out with a faster chip, let me know.”

  Bill and I had already found a groove together. I was the idea man, the one who’d conceive of things out of whole cloth. Bill listened and challenged me, and then homed in on my best ideas to help make them a reality. Our collaboration had a natural tension, but mostly it worked productively and well.

  Long before coming to Massachusetts, I’d been speculating about the next-generation chip, which had to be coming soon. I was sure someone would build a computer around it—something like a minicomputer, but so inexpensive that it would recast the market. Writing to Intel to find a local 8008 vendor for our traffic machine, I asked about their future plans. On July 10, 1972, a manager named Hank Smith responded:

  We do not intend to introduce any chips in the future which will obsolete the 8008. Our strategy will be to introduce a new family of devices which will cover the upper end of the market (the point where the 8008 leaves off, up through mini computers). … The introduction for the new family of devices is targeted for mid 1974.

  I had no way of knowing that Federico Faggin, the great chip designer, was already pushing Intel management to start work on the Intel 8080, to be heralded by Electronics in the spring of 1974. The newest microprocessor could address four times as much memory as its predecessor. It was three times as powerful and much easier to program. Hank Smith was wrong; the 8008 would soon be obsolete. As Faggin would say, “The 8080 really created the microprocessor market. The 4004 and 8008 suggested it, but the 8080 made it real.”

  One thing seemed certain: The 8080 met the criteria for a BASIC-ready microprocessor. As soon as I read the news, I said to Bill, “This is the chip we talked about.” I regaled him with the 8080’s virtues, not least its bargain price of $360. Bill agreed that the 8080 was capable and the price was right. But writing a new BASIC from scratch was a big job, something we’d never done, and the fact remained that no computer existed to run it on. Which meant there was no market. “You’re right, it’s a good idea,” he said. “Come back and tell me when there’s a machine for it.”

  I kept prodding Bill to reconsider, to help me develop an 8080 BASIC before someone beat us. “Let’s start a company,” I’d say. “It’ll be too late if we wait—we’ll miss it!” In my journal entry dated October 23, 1974, I wrote: “Saw Bill Monday night and we may end up writing Basic Compiler/Operating System for 8080.” But that was wishful thinking. Bill wasn’t ready, and I couldn’t forge ahead without him. The whole point of my moving to Boston had been for us to do something special as a team.

  We both knew that big changes were coming. But we didn’t know what shape they’d take until that chilly December day in Harvard Square.

  OUT OF TOWN NEWS sat in the middle of the square. It was near the Harvard Coop, where I occasionally nosed around for books, and across the street from Brigham’s Ice Cream, where Bill and I went for chocolate shakes. I’d stop by the stand each month to check on periodicals like Radio-Electronics and Popular Science. I’d purchase any that caught my eye, passing over the covers that hyped build-your-own ham radio transmitters.

  Like most magazines, Popular Electronics was postdated by a week or two. I was hunting for its new January issue, and what I saw stopped me in my tracks. The cover headline looked like this:

  PROJECT BREAKTHROUGH!

  World’s First Minicomputer Kit

  to Rival Commercial Models …

  “ALTAIR 8800” SAVE OVER $1000

  Beneath the large-font type was a gray box with rows of lights and binary switches on its front panel, just the sort of thing I’d been imagining.* Given the magazine’s frugal, do-it-yourself readership, I knew there had to be a single microprocessor inside; hordes of conventional chips would have cost too much. One question remained: Was that microprocessor the limited Intel 8008 or the turbocharged 8080? I suspected—I hoped—it was the 8080.

  I plucked a copy from the rack and riffled through it, my anticipation rising. I found the story on page 33, with another photo of the Altair and a harder-sell headline:

  ALTAIR 8800

  The most powerful minicomputer

  project ever presented—can be built

  for under $400.

  The first sentence of the text, by H. Edward Roberts and William Yates of MITS, the machine’s manufacturer, was the stuff of Allen-Gates dreams: “The era of the computer in every home—a favorite topic among science-fiction writers—has arrived!” The Altair represented “a full-blown computer that can hold its own against sophisticated minicomputers now on the market,” but “in a color TV receiver’s price class.”

  The next paragraph clinched it: “In many ways, [the Altair] represents a revolutionary development in electronic design and thinking. … Its central processing unit is a new LSI [large-scale integration] chip that is many times more powerful than previous IC processors.” That CPU was the 8080. Bill’s got his answer now! I thought.

  I slapped down seventy-five cents and trotted the half-dozen slushy blocks to Bill’s room in Harvard’s Currier House. I burst in on him cramming for finals; it was that time of year. “You remember what you told me?” I said, feeling vindicated and a little breathless. “To let you know when somebody came out with a machine based on the 8080?”

  “Yeah, I remember.”

  “Well, here it is,” I said, holding out the magazine with a flourish. “Check it out!”

  As Bill read the story, he began rocking back and forth in his chair, a sign that he was deep in concentration. I could tell he was impressed. “It’s expandable, just like a minicomputer,” he murmured. Priced at $397 in kit form, scarcely more than a retail 8080 chip alone, the base Altair came with only 256 bytes of memory, just enough to program its lights to blink. But more could be added with plug-in memory cards. Throw in an input/output board and an audiocassette recorder* or a rented Teletype, and you’d have a working machine for under two thousand dollars. Affordability would change everything—not just for hobbyists, but for scientists and businesspeople. And it seemed likely that the Altair could run an interactive language like BASIC, the idea dancing in my head for the past three years.

  We were looking at the first commercial personal computer.

  Bill set the magazine down, and we planned our next move. The good news was that our train was leaving the station at last. The bad: We had no idea if we’d be in time to board. Though the article made vague references to BASIC and FORTRAN, it wasn’t clear whether MITS already had 8080-based languages available or in development. In either case, we’d be sunk.

  Hoping for the best, we sent a letter to the company’s president on our old traffic-machine business stationery, implying that we had a BASIC ready to roll out. When we didn’t hear back, we followed up with a phone call. “You should talk to them. You’re older,” Bill said.

  “No, you should do it, you’re better at this kind of thing,” I said. We compromised: Bill would make the call but would say he was me. When it came time to meet with MITS face-to-face, our thinking went, I’d be the one to make the trip. I had my beard going and at least looked like an adult, while Bill—who’d routinely get carded into his thirties—still could pass for a high school sophomore.

  “Ed Roberts.”

  “This is Paul Allen in Boston,” Bill said. “We’ve got a BASIC for the Altair that’s just about finished, and we’d like to come out and show it to you.” I admired Bill’s bravado but worried that he’d gone too far, since we’d yet to write the first line of code.

  Roberts was interested, but he was getting ten calls a day from people with similar claims. He told Bill what he’d told everyone else: The first person to walk through his door in Albuquerque with a BASIC that worked would get a contract for the Altair. (As Ed later retold the story in his inimitable style, he’d settled on BASIC because you “could teach any idiot how to use [it] in no time at all.”) There was nothing we could do for the moment, he said. MITS was still debugging its in-house memory cards, which they’d need to run a BASIC demo on the Altair. They’d be ready for us in a month.

  The whole conversation took five minutes. When it was over, Bill and I looked at each other. It was one thing to talk about writing a language for a microprocessor and another to get the job done. Later I’d discover that MITS’s own engineers doubted that an 8080 BASIC was possible.