Do-it-yourself giant brains!

From punch cards to Linux, hackers love to tinker and share. Even Bill Gates can't stop them.

Published June 22, 2000 7:20PM (EDT)

Godless Russians and Communist film directors weren't the only bogeymen who gave Cold War-era Los Angeles the heebie-jeebies. In the early '50s, the Los Angeles Police Department confronted a truly destabilizing threat: pinball.

City authorities, considering pinball machines to be implements of vice and corruption, would break them up for spare parts, consigning the debris to bins in the police department's electronics shop at downtown's Lincoln Jail. Among the parts stockpiled at the jail were heaps of solenoids -- electromagnetic coils used for initiating pinball plunger and bumper action.

Solenoids are really, really good at facilitating on/off flip-flops, mechanically opening and closing circuits depending on the flow of electric current. During World War II, solenoids were a vital military resource. But afterward, they sat unused -- until a couple of teenaged proto-hackers named Phil Cramer and Bill Fletcher came along.

Cramer and Fletcher knew just what to do with leftover solenoids. They would be perfect for constructing Giant Brains!

Like many other aspiring electronics geeks in the immediate postwar era, Cramer and Fletcher had had their imaginations captured by a book published in 1949 called "Giant Brains, or Machines That Think." Written by Edmund C. Berkeley, an expert in the then-infant field of computing, "Giant Brains" was both a primer and a manifesto.

In language that managed the delicate trick of being exquisitely clear and uncompromisingly evangelistic, Berkeley described how a computer works, step by step, instruction by instruction. Employing numerous diagrams, and painstakingly explaining every underlying concept (like "binary" or "register" or "input/output") as if it had never been explained before, Berkeley demonstrated how it was possible to move digital information from one "place" to another -- and how a set of on/off switches, if wired correctly, could perform operations on that information, handling such extraordinary feats as the addition of two plus two.
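Berkeley's switch-level arithmetic translates directly into modern terms. The sketch below is an illustration, not Berkeley's own notation or Simon's actual circuitry: it builds addition out of nothing but on/off values and the AND/OR/XOR "wiring" between them, exactly the trick his diagrams walked readers through.

```python
# Toy illustration of Berkeley's point: addition built purely from
# on/off switches (here, the bits 0 and 1) wired into AND (&),
# OR (|) and XOR (^) gates. A sketch, not Simon's real circuitry.

def full_adder(a, b, carry_in):
    """Add three one-bit inputs; return (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two little-endian lists of bits."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)  # final carry becomes the top bit
    return result

# Two plus two: 2 is [0, 1] little-endian (0 ones + 1 two).
two = [0, 1]
print(add_bits(two, two))  # [0, 0, 1] -- binary 100, i.e. four
```

Chain enough of those full adders together -- in 1950 that meant relays or solenoids, today it means transistors -- and "the addition of two plus two" falls out of plain wiring.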

In 1952, Berkeley walked the walk. He built his own computer, Simon, considered by some historians to be the first "personal computer," and documented the process in a series of 13 articles for Radio Electronics magazine. Cramer and Fletcher, demonstrating a cavalier attitude towards proprietary information that would become a calling card for do-it-yourself hackers in generations to come, ripped the pages of schematics right out of copies of the magazines at their local library. (As Cramer noted shamefacedly, 50 years later, "We had no duplicators in those days!")

Fletcher had a contact within the police department who let them rummage through the bins of electronics. Following Berkeley's instructions, the two teenagers built a simple solenoid relay-driven computer. It didn't work exactly as planned, but it did something, and that was enough. Enough to get Cramer's father, an accountant, to give the youngsters $50 to buy more parts. Enough to encourage them to try again, to build another simple machine that did work. Enough, in the case of Phil Cramer, to launch him into a life spent tending Giant Brains, a career of computer programming that continues to this day.

Dig under the surface of your average computer geek and you will find a person in love with the idea of having a Giant Brain of their own to play with. As computer scientist Dick Karpinski observes, computers, or to be precise, the act of programming computers, "is the only way to have socially acceptable slaves."

Over the decades, the opportunity to harness the power of a computer to one's own selfish purposes, whatever those may be, has proven irresistibly seductive. From the '50s kids who gravitated inexorably to IBM mainframes to the Homebrew Computer Club tinkerers who built the first personal computers in the '70s to the Linux hackers exchanging tips and tricks in their users' groups in the '90s, the underlying passion is identical: It is a whole lot of fun to be the master of a Giant Brain, down to the very last binary one or zero.

And anything that hinders that mastery is resented. Abhorred. Reviled. Detested. It is no accident that the hacker of the '50s and '60s despised IBM while his counterpart in the '80s and '90s denounced Microsoft. They got in the way! The love affair that so many programmers have with free software isn't reducible to mere respect for an efficient software development methodology. It is also an expression of the programmer's age-old craving to be in intimate control of every aspect of the machine -- and unwillingness to allow any barriers to block the Source.

Of course, when programmers like Cramer got started, there was no real difference between hardware and software. It was all just stuff that made the machine go. You might be inputting instructions by floppy disk, or magnetic tape, or punch cards, or paper tape, or even by wiring plugs together -- the medium simply didn't matter: The point was to get the machine to work.

Hackers have always understood this -- and indeed, in the earliest days, there was little need to worry about any separation. Hardware came with software, and you fiddled with both until you got the machine to do what you wanted it to do. But over the decades since the '50s, the growth of the commercial software industry, combined with the increasing complexity of software, has worked to divide programmers from the object of their passions. As programmers began to be consigned to smaller and smaller pieces of a larger and larger pie, the job became less and less fun. And when a 19-year-old Bill Gates appeared on the scene in the mid-'70s, admonishing the Homebrew hackers to stop "stealing" his BASIC programming language, the end of hacker happiness seemed nigh.

But true hackers don't let little things like monopolies or near-infinite complexity stop them. As their first line of defense, they have always sought strength in numbers. In the '50s, programmers like Phil Cramer joined a powerful IBM computer users' "club" called SHARE, determined to pool their resources in order to get their machines working better -- and make IBM dance to their tune. In the '70s, the Homebrew hackers likewise came together, in garages and living rooms, to share their expertise and code in the service of their new, desk-sized, power-to-the-people machines. And from the '90s right on through to a new century, free software hackers have also flocked to one another. The programmer-computer relationship may be an inescapably solitary interaction, but the fight for control over every last bit of digital and silicon granularity requires collective effort.

The free software movement has often been described as being by hackers, for hackers, with the rest of us just lucky beneficiaries of the byproduct of hacker obsessions. Even as the free software movement has been organized and corporatized, at root it's still the same as it ever was -- a movement fueled by tinkerers who are constitutionally unable to allow anything to stand between them and their machines.

It doesn't matter whether the tools of their trade are piles of solenoids or copylefted compiler and debugger programs. Hackers will stop at nothing in their drive to play with their Giant Brains. If that means that along the way they'll build the Internet, unleash the personal computer industry and topple Microsoft, well, so be it: They just want to have fun.

The year was 1950. Barry Gordon sat at a desk, a mechanical Friden calculator to his left, punching numbers. He wasn't alone. The "math section" at the Mutual of New York insurance company included rows of similar desks, all featuring Fridens, clerks and actuarial tables. The job was mind-numbing: Punch a number into the Friden. Multiply it with a number from the actuarial table, read out the result, write it down and multiply it by another number. Over and over and over again.

"I get called in by my boss, one day," says Gordon, a fast-talking New Yorker who grins as he starts into a story he has obviously told many times before. "He was the head of the math section, and he says, 'We have a new IBM 604 electronic calculator, and we'd like one of the actuarial students to learn how to use it. What do you think?' I said, 'Sounds terrific!' I didn't know what the hell he was talking about -- I'm 23 years old, on my first job -- but I said it sounded great."

Without ever having laid eyes on the machine, Gordon mastered the essentials of the 604 just by studying the manual -- well enough that he was able to fix it when it malfunctioned a couple of weeks later. Within days, he was transferred into the "tabulating division." He felt blessed.

"We used to sit in the math section," says Gordon, "churning out rates and forfeiture values and dividends and whatnot, and now I'd come across this fabulous electronic machine that does stuff miraculously at lightning speeds, and I'm thinking, I'm in on a revolution! I'm going to free mankind from the drudgery of sitting at calculators!"

There was just one little problem. Once the 604 had been set up to do whatever particular operation was required, it would spit out "reams and reams" of punch cards that in turn would be fed into the "tabulator," which would then promptly produce printouts of neat columns of numbers. But only numbers -- there were no headings, no indications of what the numbers referred to. The tabulator could handle only numbers. To make the information "camera-ready" for later copying and distribution, those headings had to be added.

"So one day I'm wondering, where do those headings come from?" says Gordon. "And one evening, on a break during overtime, I go down to the math section where all the people had been freed from the drudgery of hitting the calculators. And they are sitting there, cutting and pasting headings on the pages to make camera-ready copy! And I thought, this is what I have freed them for? There is something terribly wrong with this whole system! At that point I almost quit computing to do something else."

Mort Bernstein's living room explodes into laughter. Barry Gordon is sharing his story with a roomful of aged programmers -- veterans of the industry who all started out in the early '50s. The breadth of experience shared by the nine men and one woman gathered together in this Santa Monica suburb is impressive. They have literally seen it all.

Today, they trade tips on avoiding Microsoft Outlook viruses, or dual-booting their personal computers with Linux. But their experiences are grounded in the biggest of "big iron" -- cumbersome IBM mainframes like the 701 "Defense Calculator" and its successor, the 704. Most were employed by Southern California aerospace companies or think tanks: RAND, Lockheed, Douglas, North American Aviation. Notwithstanding Gordon's insurance tales, these programmers were accustomed to solving problems of slightly more significance than, say, handling Web site traffic or calculating life expectancies. Irwin Greenwald simulated the H-bomb explosion on Eniwetok at RAND, for example, while Westinghouse's Frank Engel modeled the possible deadly malfunction of a Nautilus submarine nuclear reactor.

Some, like Phil Cramer, are still working -- he runs a company that specializes in software for auto dealerships. Others, like Frank Wagner, the patriarch at this meeting, have been retired for several decades. Still others, like the host, Mort Bernstein, amuse themselves with hobbies: Bernstein is trying to program an "emulator" for RAND's Johnniac computer on his PC, but is finding it to be quite a task. Not only does he have to emulate the Johnniac's processor, but he also has to emulate the punch cards that fed it, and the punch card reader, and so on.

The youngest doesn't look a day over 65, but that isn't slowing anyone down. Jokes, anecdotes about long-forgotten computers and disputes about ancient programming lore zing back and forth. They delight in insulting each other, their long-dead colleagues, and most of all, IBM. Programmers of any age appreciate the art of the deftly delivered sarcastic jab, or, in the parlance of these coders, the "cut-down." When Gordon recalls a going-away party at IBM for a manager who appears to have been spectacularly nondescript, and quotes the parting toast "John, your departure will fill the void that was created when you first got here," the laughter is hardly polite -- it's an uproar.

The occasion is a reunion for former members of SHARE, an IBM users' club. Founded in 1955, SHARE may have been the first official computer users' group ever. Certainly, for a time, it was one of the most powerful.

SHARE was an outgrowth of an earlier collaboration between Southern California aerospace companies called PACT -- the Project for the Advancement of Coding Techniques. PACT's goal was to write a compiler for the 701, IBM's first commercial digital computer. SHARE started out as a proactive measure to prepare for the arrival of the 704. Although the name predated a later reverse-engineered acronym -- "The Society to Help Alleviate Redundant Engineering" -- from the very beginning SHARE aimed to save individual programmers from the sorry fate of writing basic code for essential tools that had already been written by someone else.

"We wanted to do something about this silly business of everybody programming their own square root routine," recalls Wagner, who managed programmers at the Mustang fighter plane manufacturer North American Aviation. The 704, which filled a large, specially built room with its card punch, card reader, printer, CRT, magnetic tape reels, magnetic drum, magnetic core storage, central processing unit and operator console (each of which was a separate machine), was not only hard to use, but also didn't come equipped with much in the way of pre-installed "software." Instead, the 704 arrived with a 103-page "Principles of Operation" manual, an assembler and some very basic utility programs, such as a single punch card "bootstrap loader" to get the machine started.

These huge mainframes were more than welcome to the aerospace companies -- they were essential. In the tense days of the Korean War, the pressure was on the defense industry to keep up with every move made by the Soviets and Communist Chinese. But, as Wagner recalls, it was getting harder and harder to tote up the necessary numbers involved in high-tech aeronautic design. "We were drowning in arithmetic," says Wagner, a genial, albeit occasionally sharp-tongued man treated by the other programmers with a mixture of respect and friendly deference. "Whenever an aircraft design changed, you had to go way back and start all over again."

Cooperation, for the purpose of the elimination of "redundant" effort, was the order of the day. The companies didn't share all their software -- they kept their structural analysis programs to themselves. But they did share the tools that they used to build such programs. Such cooperation only made sense to programmers who hated wasting their time, and who, according to Gordon, "tended not to be company loyal or commercially oriented."

"They were loyal to their profession," says Wagner.

And their drinking buddies. Certainly, as an example of programmer pragmatism prefiguring such open-source standbys as the Apache Web server, or Linux itself, by decades, SHARE is historically noteworthy. But SHARE wasn't just about saving money and time -- it was also about having fun with your community. Going to SHARE meetings was a blast, and not least because every night there was an open bar "SCIDS" meeting: The "SHARE Committee for Imbibers, Drinkers and Sots." SCIDS was where the action was, where the "technical" people gathered to lubricate themselves on alcohol and algorithms. As John Backus, the principal author of IBM's Fortran programming language, noted during a 1980 commemoration marking 25 years of SHARE, "There were two principal pleasures of SHARE: Blasting IBM -- giving them hell for fouling up and not giving them what they wanted -- and the second activity, known as SCIDS."

Programmers labor under a stereotype that maligns them as anti-social shut-ins. But while it is true that the act of programming is solitary toil, programmers are also intensely social. It's hardly an exaggeration to suggest that the Internet was built mainly so that programmers would have a place where they could get together and chat about their favorite science fiction novels. But before the Internet, you had to gather at conventions or conferences. That's where you shared notes with your colleagues, that's where you learned your craft and honed your programming chops. That's where you figured out how to fix your Giant Brains.

Because even if building your own Giant Brain is fun, it sure isn't easy. It's hard work, and you're going to need help. And you're not going to care if the guy you're knocking back whiskeys with, discussing the finer points of matrix inversion, works for your competitor. The point is to get the machine to function properly, to get the buzz that comes from observing the translation of your will -- your computer program -- into action.

These early programmers were insatiably curious tinkerers and inventors eager to dissect every function of their machines, and to learn every clever workaround or deft coding maneuver that would help them get their job done. The advent of proprietary code that could not be tinkered with, could not be taken apart and reconfigured like a bunch of Lego blocks, was extraordinarily annoying. When they started to hack, whether as teenagers wiring together solenoids, or as defense contractors responsible for fending off the Russkis, hardware and software were both just means to an end. Often, in order to change the programming in one of the early computers or tabulating machines, you had to rewire the darn thing yourself. Typically, this was done by means of a "plug board" -- a panel much like the telephone operator's switchboard with which Lily Tomlin once wrought havoc. If you wanted to change the sequence of operations, you pulled out a plug connecting a wire to another point and plugged it in somewhere else.

It was only later, as software became increasingly less physical, that it even became possible to cut programmers off from handy access to their machines' digital innards. And for the first decade or so after the introduction of the commercially sold computer, few people even realized why doing so would be advantageous.

"There was no economic value in software," notes Bernstein. "No one recognized software as an economic entity at the time."

IBM's domination of the computer market partially explained the failure to see software as a revenue generator. IBM was opposed to selling anything. Its business model was based on leasing hardware. Leasing made it easier to plan for the future and calculate ongoing revenues. Hardware, software, support personnel -- it was all leased, all bundled together in one monthly package. The inclusion of the software was a carefully thought-out strategic measure that locked customers into IBM. If you used IBM hardware, then you used the software that came with it. And if you spent thousands of hours mastering that software, then when it came time to renew your lease, you stayed with IBM -- why on earth would you want to invest thousands more hours getting up to speed on someone else's software, even if new hardware from another vendor might be technically superior?

Bundled software didn't automatically mean bundled source code -- which was one reason that a group like SHARE was necessary. In addition to providing a means for IBM users to share the code that they wrote individually, SHARE was also an important lobbying tool for getting IBM to make changes, add features and otherwise respond to customer concerns. Especially during the early years, SHARE's membership represented a majority of IBM's most important customers. When it complained, IBM listened. It was a user's "club," explains Frank Wagner -- in the sense of the "club" being a large stick useful for beating IBM about the head with.

During their reunion, the SHARE programmers talked about IBM in precisely the same way that hackers today rail against Microsoft. IBM was the eight-gazillion-pound gorilla. IBM pioneered FUD and the practice of "vaporware," forced bad technology down customers' throats and got away with it all because it owned the market. Even the SHARE members who at one time worked for IBM -- such as Gordon, who put in 25 years at Big Blue, and Bernstein, who did a consulting stint there -- joined the IBM derision. IBM got in the hackers' way.

But in one of the great unpredictable ironies embedded in the history of software, IBM's decision to stop bundling its software with its hardware turned out to be the catalyst that really launched the commercial software industry -- and made proprietary control of source code start making sense to the corporate computing world. In 1969, says Mort Bernstein, as the Department of Justice began to prepare its epochal antitrust suit against IBM, IBM decided to engage in some proactive defensive maneuvers. By unbundling IBM's software offerings, and charging for them separately, IBM hoped to avoid the accusation that it was unfairly leveraging its monopoly control of the market for mainframes.

And in a flash, the commercial software industry was born.

"There was no software industry to speak of until that moment, and then it began to burgeon," says Bernstein.

IBM's act of unbundling presents an intriguing contradiction. Opening up competition undoubtedly contributed to faster growth and greater opportunity for companies looking to make a profit in the world of computing, with a consequent increase in the number of jobs for programmers. But individual programmers ended up becoming further divorced from the hands-on finagling with bits and bytes that made their jobs fun. Not only were they denied access to the source code of commercial programs, but their own responsibilities were steadily being curtailed.

Fun, for programmers, was on the run. As programming tasks became larger, employing thousands of programmers at a time, programming duties became more and more tightly segmented and regimented.

"Programming went from an individual craft to a 'professional' activity that had to be managed," says Bernstein. "As computing became a fundamental necessity in every aspect of commerce and industry, more needs arose. Programmers were asked to provide reasonably precise estimates of the time and resources that would be needed for each development. Based on this, budgets and schedules were produced. Management expected the result to match the estimates within reason. This is a real sea change, from a freewheeling craft performed by 'artistes' to a tightly managed activity supposedly performed by 'professionals.'"

To some observers, the changes in the industry could be placed in the broader, and more suspicious, context of the move to "scientifically manage" office work of all kinds. Following principles established by Frederick Winslow Taylor, an inventor who influenced the creation of assembly-line manufacturing, management began treating programmers as if they were the plugs to be moved around the plug boards, rather than creative visionaries with minds of their own. As Joan Greenbaum, a former IBM programmer, writes in "Windows on the Workplace":

"The first step that management took to gain control over the programming workforce was to divide the conceptual work of programming from the more physical tasks of computer operations. Although this division was put into effect in the aerospace industry in the mid-1950s and subsequently used by companies that had defense contracts, it wasn't until the mid-1960s that it spread elsewhere. By 1965, when IBM began installing the general-purpose System 360, both the more expensive hardware (a large mainframe computer) and the easier to use software (an operating system that could be controlled through commands rather than operators working switches), gave upper and middle managers room to begin enforcing the separation of programming from operations. Operators were to stay in the 'machine room' tending the computer, while programmers were to sit upstairs and write the instructions. Those of us in the field at the time remember feeling that a firm division of labor had been introduced almost overnight."

Proprietary code. Tight divisions of labor. An end to the freedom and fun of the golden age. By the late '70s, things had become pretty bleak for the corporate programmer.

But help was on the way.

"Then the PC came along and the ball game changed again," says Bernstein. "Who has to account for the time they use on their PC? No one! It's an appliance on my desk that is of the same nature as my telephone. And the world of the programmer had come full circle. The PC is what makes the open-source community as freewheeling as it is. Linux and open source could never exist in the world of tightly controlled corporate computing."

It is not every day that you meet someone who has an antique rubidium atomic clock ticking away the minutes in his study. What's even more unusual is when the clock -- a fairly large timepiece, about the size of an electric dishwasher -- doesn't even stand out. Among the old DEC minicomputers, teletype machines and obsolete tape drives that line the walls of Bob Lash's back room, the atomic clock, which was once a fixture at California's Vandenberg Air Force Base and is practically indestructible, is almost an afterthought. Compared to the home-made computer that Lash built while going to high school in the '70s in Palo Alto, it is hardly remarkable at all.

Bob Lash enjoys the honor of being one of the youngest attendees at the first meeting of the legendary Homebrew Computer Club in Gordon French's garage on March 5, 1975. That alone is testament to some sublime geekiness. Silicon Valley's Homebrew Computer Club is a staple in the annals of computing history. Two of the best books about the history of the personal computer, Steven Levy's "Hackers" and "Fire in the Valley," by Paul Freiberger and Michael Swaine, devote hundreds of pages to Homebrew. Steve Wozniak, co-founder of Apple, is a Homebrew alumnus, as is Lee Felsenstein, the designer of two early and beloved personal computers, the Sol and the Osborne. The role of the Homebrew hackers in creating the personal computer, in delivering the power of a Giant Brain to you and me, is difficult to overestimate.

But Bob Lash's study takes personal geekiness to new heights. The study could be mistaken for a museum of computing, although it is actually anything but. The oscilloscope, soldering iron and bins of electronic parts offer a clue. These relics, the DEC PDP-11/23 and PDP-8/I, are in working order. This room is a living shrine, proof that Lash, a programmer who is very much alive in the present -- his luxurious home in the hills above Redwood City was paid for largely by the sale of a Web chat software company that he and another Homebrew alumnus founded -- hasn't lost sight of his past.

Some of the artifacts in this room are part of an attempt to re-create Lash's very first encounter with a computer, when he was only 6 years old. He was taking part in an experiment at Stanford aimed at studying the teaching of mathematics to young children. After one of the sessions, he was allowed a peek into a nearby machine room -- "Basically, a room full of cabinets, with a console filled with blinking white lights, and there was a tape drive with a pair of reels sitting on it: a DEC tape drive. It made a huge impact on me. There was a fellow there and he pointed to a key on a console and told me to push it, so I did, and when I did, suddenly a pair of reels started spinning. The thought that you could hit a key on the console and then make something else happen, that you could control something remotely by a computer, was to me an entirely new idea."

I observe to Lash that it is not uncommon for 6-year-olds to be confronted with "entirely new ideas." He smiles, but it is nonetheless clear that the encounter was life-transforming. Right next to Lash, as he speaks, sits a DEC tape drive, not quite the same model as the one he encountered as a youth, but close enough. The teletype machine connected to the PDP-8/I is also nearly identical to the one he recalls. More, perhaps, than any programmer I have ever encountered, Lash has concentrated his intellect and his skills on capturing the reality of his own human-computer interaction. Barring cyborg surgery, he is a man who has gotten as close to the machine as possible.

And like his fellow Homebrew hackers, he's convinced that everyone should share the joy. In his study there are also four personal computers. One old 486 IBM clone runs Linux and is hosting a Web server. It is also connected to the teletype machine that feeds instructions to the PDP-8/I. Once Lash straightens all the bugs out, Web surfers will be able to run their own old PDP code on Lash's machine, if they still have any, or test out any new code that they might want to take for a spin.

But why would anyone want to do that? The PDP-8/I was popular in its time, but compared to today's computers, it's a joke -- a three-year-old personal computer is about a thousand times as powerful. Sure, it looks cool, with its flashing lights and switches and knobs, but it's not really what one would call a high-powered productivity tool.

But Lash has his reasons: "What I'd like to do is give younger programmers the chance to get some hands-on experience at communing with the machine at its deepest level," he says calmly. "With today's processors you really don't have a feeling for what's going on inside, they're like black boxes. But in a machine like the [PDP] 8 everything that is happening inside is shown on the indicator lamps, all the registers are displayed and you can actually get right down to the register level and understand what's really happening step by step. Nowadays there is almost no ability for people to do that any more. They've lost touch with that."

Right down to the register level -- the places within a computer's central processing unit where individual bits of information (or the location of those bits of information) are stored. Berkeley's "Giant Brains" book laid out the workings of the computing machine; Lash wants to make those workings transparent to a Web-enabled public. The benefit, he argues, is better software.
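To make Lash's point concrete: the toy machine below is a hypothetical sketch -- nothing like the PDP-8's real instruction set -- but it shows what "watching the registers step by step" means in software terms, with every register snapshot recorded the way the 8's indicator lamps displayed them.

```python
# A hypothetical toy machine illustrating register-level visibility.
# This is NOT the PDP-8's actual instruction set -- just a sketch of
# what it means to watch the registers change step by step.

def run(program):
    acc = 0      # accumulator register
    pc = 0       # program counter register
    lamps = []   # snapshot of the registers after every step
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc = acc + arg
        else:
            raise ValueError("unknown instruction: " + op)
        pc += 1
        lamps.append({"pc": pc, "acc": acc})
    return acc, lamps

result, trace = run([("LOAD", 2), ("ADD", 2)])
print(result)  # 4
print(trace)   # [{'pc': 1, 'acc': 2}, {'pc': 2, 'acc': 4}]
```

On a modern processor all of this state is buried inside a black box; on the front panel of a PDP-8/I, the equivalent of the `lamps` list was simply glowing at you.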

"I think that programmers who can see what is happening all the way through down to the deepest levels of the machine have the best understanding of what's really happening," says Lash, "and that has a big influence on how they think about what they are going to do and how they are going to do it. In this age of code bloat and inefficient systems and sluggish Windows machines that are shimmying and shaking and smoking, I think that if today's programmers had a better understanding of what's happening under the hood they could and would build more efficient systems."

But again, it's not just about creating technically superior software, it's also about connecting to the fun that's at the heart of the computing experience.

"I think having an appreciation and an understanding of what's happening is helpful, but even more important than professional benefits, I think that for young programmers, it fires up a sense of enthusiasm and wonder for what they're doing. It isn't just a dull boring job -- 'Let's crank out another application because we have a deadline.' It's really a miracle when you see what happens, and it's a wonder. A kind of spark and spirit can be of enormous benefit to the work that you are trying to accomplish. I guess it's more about motivation than any real technical point."

Lash recalls attending Homebrew meetings with exactly the same mixture of nostalgia and glee that the SHARE programmers remembered experiencing at their SCIDS meetings. His desire to understand his computers, and his belief that everyone would benefit from a similar understanding, is also directly connected to the explicit free software ideology that has sprung up in computing in more recent years. Lash is a believer in Linux-based operating systems -- as is his 11-year-old son Elliot, who, when he isn't playing Starcraft, is running a Linux-based operating system called Phat Linux.

For Lash, who reverently shows me a loose-leaf binder filled with hundreds of pages of all the lines of code he wrote two decades ago for his home-made computer, the code is the machine.

Most of the Homebrew hackers felt the same way. Which might just explain why they got their dander up when Bill Gates wrote them a nasty letter in 1976.

On February 3, 1976, less than a year after the founding of the Homebrew Computer Club, a 20-year-old named Bill Gates, who titled himself General Partner of a then little-known company called "Micro-Soft," wrote a screed titled "An Open Letter to Hobbyists."

Bill Gates and his partner Paul Allen had written a version of the BASIC programming language for one of the first mass-produced personal computers, the Altair. Without it, there wasn't a whole lot that you could do with an Altair. But to Gates' dismay, he had discovered that less than 10 percent of all Altair owners were paying for a copy of his BASIC -- instead, hackers were making their own copies and giving them away. To Gates, this was outright thievery.

"As the majority of hobbyists must be aware," wrote Gates, "most of you steal your software. Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid?"

"Who can afford to do professional work for nothing?" continued Gates. "What hobbyist can put three man years into programming, finding all bugs, documenting his product and distribute for free? The fact is, no one besides us has invested a lot of money in hobby software ... Most directly, the thing you do is theft."

Like the Homebrew Computer Club itself, Gates' letter is an indelible icon of computer history. For some, it marks the birth of a billionaire, outlining in bald terms the psychology of one of the most successful businessmen of the 20th century. For others, it foretold the death of the hacker dream that information should be free. For still others, it draws the line in the sand between the world of free software and the empire of Microsoft. Want to know why so many hackers despise Microsoft? Read the letter.

But didn't Gates have a reason to be angry? Weren't the Homebrew hobbyists stealing BASIC? Well, in a sense, sure.

"We would say 'bring back more copies than you took,'" recalls Lee Felsenstein, remembering Homebrew Club meetings held in the Stanford Linear Accelerator Center (SLAC) auditorium. "Altair BASIC showed up first as a paper tape that had been ripped off or liberated in late '75 and was being passed around or copied -- a teletype could copy it. Sometimes I would hold up the blackboard pointer and say 'put them here,' and people would skewer their rolls of tape on the pointer."

I meet Lee Felsenstein, designer of the Sol and Osborne personal computers, in a nearly unfurnished office in Palo Alto that he is using as a temporary work space while he looks for a new job. Just a few weeks earlier, his previous employer, the Paul Allen-funded Interval Research think tank, closed its doors. While he wraps up a credit card purchase on his cellphone, I examine the only part of the room that shows signs of life. On a large workbench stretching along one wall are the tools of his inventor's trade: a fancy digital oscilloscope, a heat gun, a power supply, and trays upon trays of silicon chips: the basic building blocks of the modern computer.

By now I have read "Giant Brains" and have learned the basics of how information can be moved from register to register, and what kind of operations can be enacted on that information. At Bob Lash's home, I have had the chance to examine both his homemade computer and the reams of code he wrote to make that computer work. Now, I'm looking at a more contemporary version of the same concept, the silicon-based chips that embody those operations. And I am realizing: hardware, software, chips, wires -- for me too, it is becoming difficult to make any clear distinctions. And as I probe Felsenstein's memory of the letter from Bill Gates, I begin to see exactly why the Homebrew hackers got so mad at being called thieves even as they blithely admitted that they were copying someone else's copyrighted software.

Did Felsenstein remember the moment he read that letter?

He rolls his eyes.

"Yes, absolutely," says Felsenstein. "I read it aloud from the floor of the Homebrew Computer Club. To great derision. I read it to the multitudes assembled in the SLAC auditorium. Everybody thought that was hilarious, and they were damned if they were going to send them $500."

Felsenstein draws a detailed picture of the moment. The SLAC auditorium holds 275 people, he says, but it was only about two-thirds full.

"The people involved were not at the top of any stratum," says Felsenstein. "They were second string and below ... They were people who wanted desperately to have access to computers. The majority of them worked in the electronics industry in Silicon Valley, and some in the computer industry, but they were not permitted access to computers at their work."

So when MITS, a company in Albuquerque, New Mexico, announced the arrival of the Altair, one of the very first inexpensive personal computers, they jumped at the chance to buy it, and then were almost immediately disappointed. The Altair was little more than a box of parts that barely worked. For one thing, it came with no devices for getting information into and out of it, which meant that the Homebrew hackers were faced with quite a bit of extra tinkering, at extra cost.

"At the time when the Altair personal computer was being delivered it was found to be difficult to get it running," says Felsenstein. "It was poorly designed in several ways. And once you finally got an Altair hooked up together, it didn't do anything."

"The machine is not complete until the software defines what it is doing," says Felsenstein. "I view software as another component of the system. Our view of this was, this is like the last part of the machine that everybody sweated bullets to not only buy but to learn how to put it together and so forth, and what the hell, what business do they have trying to jack us up for another $500? This makes the thing go and I want to make it go ... It seemed like a kind of a bait-and-switch at the time to say 'You can buy a computer for $297 but oh, sorry, all you get is a bunch of parts that don't give you any I/O [input/output] and oh, sorry, once you get that set up at great cost to yourself you don't get any software that lets you do anything but you can pay more for that.'"

"We didn't really know who Bill Gates was," continues Felsenstein. "He seemed to be involved with MITS. He started out as a MITS employee ... As far as we could tell that's all that he was. Somebody had come along with a BASIC and attached themselves to MITS and said 'Now we can really clean up because these bozos don't realize they need software for their boxes.'"

"And then he comes in with this letter," says Felsenstein, "saying 'we haven't gotten the kind of money we wanted and you guys are all crooks -- most hobbyists steal their software.' Well, you know, for people who were out about ten times what they thought they were going to be out when they answered the ad for the Altair, the concept of thievery is a little different. [We said] 'You guys in effect stole our money, and now you want another $500? I'm sorry, we want our computers to work.' The general feeling was, let them ask nicely, don't call us crooks to begin with, we see who the real crooks are."

The letter, says Felsenstein, drew a line that could never be erased. Once and for all, Bill Gates declared that he was not a hacker -- that, on the contrary, hackers were his enemy.

"The oppositional stance kind of presaged everything else," says Felsenstein. "We wanted somebody to say, 'Look, I am one of you and here is what is going on.' He removed himself from our society and our culture with that letter and with his subsequent actions."

I ask Felsenstein if he recalls the reaction of the Homebrew audience the moment he read the line "most of you steal your software."

"There was much hooting," says Felsenstein. "We had a good time laughing at that letter. And as I like to say now, what a shame that Bill Gates didn't get his money. He could have been a contender."

Pandemonium reigns at the Four Seas Restaurant in San Francisco's Chinatown on the evening of June 15, 1999. The line to get in stretches down the stairs from the second floor, out the front door, and spills onto the sidewalk. The buzz of conversation is half-anxious, half-excited -- it's as if the assembled crowd is waiting to see some hot new band, but isn't sure that there's enough room for everybody.

They haven't come for the food. They have come to attend, of all things, a Bay Area Linux User's Group [BALUG] meeting. Of course, it's not your average BALUG event. While the meetings have become steadily more crowded over the past few years, in close correspondence to the rise in popularity of Linux-based operating systems, they aren't usually this kind of a madhouse. But tonight, Linus Torvalds has come to speak, and in June of 1999 in San Francisco, that means a major techno-cultural media event is at hand.

By the spring of 1999, Linux fever is reaching hitherto unimaginable heights. One distributor of Linux-based operating systems, Red Hat, has just filed to go public, and others are soon to follow. Everyone is in love with Linux -- Wall Street, the press, the pundits. On this particular evening, 400 people will stuff themselves into the Four Seas Restaurant, in part to hear Torvalds, but also to celebrate their own great good fortune. Programming is fun again, and free software is a big reason why.

It's actually kind of a shame that Torvalds' appearance has skewed the BALUG meeting from its normal, less media-friendly fare. These meetings follow directly in the tradition of both the Homebrew Computer Club and even the SHARE IBM user's club -- although few, if any, of the attendees have ever heard of SHARE. It's time to get together, drink some beer, chow some middling Chinese food, and talk shop. Figure out how to get your system working, how to get your giant brain tuned and optimized the way you want it, without any taint of proprietary software obscuring the machine's inner mysteries. At closely related "installfests" organized by various LUGs, you can even bring your computer and get experts to walk you through the often torturous process of getting a free software operating system running correctly on it.

This is the kind of forum in which Torvalds excels, where he can engage in his favorite interactive question-and-answer format with technically knowledgeable people. I have seen him speak to crowds numbering 20,000 and higher and field reporters' queries at press conferences with aplomb, but he is most comfortable when being asked detailed questions about arcane aspects of software and hardware. Torvalds has a way of being both self-deprecating and utterly sure of himself that works well with audiences. When he jokes that "I am basically lazy" or that "if I don't understand something, I think it is bad," he gets a big laugh, but no one underestimates him.

Sitting at one of the big round tables that dot the banquet floor, listening to a smattering of hackers exchange jokes and chatter about the various Linux happenings of the day, I'm impressed at the general sense of good feeling that flows through the room. People are happy to be here. They're happy to be talking about obscure matters of symmetric multiprocessing performance and source control management. They're especially happy to be doing what they do, which, more and more, means hacking on Linux-based operating systems. A year later, when I hear Lee Felsenstein muse about what makes open-source software fun, I immediately think back to the night at the Four Seas.

"The open-source movement is a direct descendant of the Homebrew Computing Club," says Felsenstein, "motivated by the same things -- that very seductive goal of creating what never has been before, and doing it in a sharing community where coercion is absent and the joy and the beauty of the creativity is manifest in everyone. It's very, very powerful."

It's the same fun that is recalled by SHARE programmers when they reminisce about SCIDS meetings. But there's one big difference.

Torvalds gets one of his biggest laughs of the night when he makes a glancing reference to the commercialization of Linux. Referring to Linux hackers, he says, "They can laugh at the stupid company who will pay them for what they would do [anyway] for free."

Few audiences could be more receptive to his quip than the hackers assembled in front of him. The evening is being sponsored by two companies, VA Linux and Linuxcare, which have for the past six months been hiring every member of any Linux User's Group they can get their hands on. Not many of the hackers may have imagined when they first started coming to the meetings that their hobby would suddenly make them highly sought-after professionals, but that's certainly the case now. Linux hackers are hot. Who can afford to do professional work for nothing? Right now, a good many companies are trying to figure out new answers to that exact question.

The fun is back in programming. The industry has indeed come full circle. Once again, hackers are being paid to do just what they want to do -- only this time around, everything they produce is made freely available to the general public.

In June of 1999, or June of 2000, for that matter, it's impossible to say how long this hacker-blessed happenstance will continue. Perhaps the present period is a new golden age of hacking doomed to decline or morph into some less blissful stage. And of course, not all jobs at Linux-related companies are ideal, nor is there any certainty that giving software away will wrest control of the computing industry from Microsoft. But for the hackers twiddling their chopsticks, happily continuing a cherished tradition of community and mutual self-help, such quibbling about what has not yet come to pass is a waste of time.

"There are a great many things that all of us could do much better if we could only apply what the wisest of us knows," wrote Edmund C. Berkeley in "Giant Brains, or Machines That Think." His point was an argument for the worth of the computer as a concentration of human knowledge made available to all humans. But it also can be taken to represent the value of the people who create those computers, who work together to share their wisdom. All of us could do much better, he seems to be implying, if we are given the freedom to SHARE.

- - - - - - - - - - - - - - - - - - - - -



By Andrew Leonard

Andrew Leonard is a staff writer at Salon. On Twitter, @koxinga21.
