Near the start of “A Bug’s Life,” this season’s computer-animation blockbuster, the dejected hero — an innovative ant named Flik whose ahead-of-their-time ideas keep backfiring — breaks down and cries: “I’m never gonna make a difference.” Halfway through the movie, when Flik’s grand scheme to save the colony has gone awry, he repeats: “I just wanted to make a difference.” The ant queen sternly reminds this renegade that “it’s not our tradition to do things differently” — but thinking differently wins out in the end.
Steve Jobs’ name isn’t prominent in the credits of “A Bug’s Life” — but the film was produced by Pixar, a company Jobs has owned since 1986. It’s no coincidence that the current slogan of that other company Jobs now runs, Apple Computer, should coil through “A Bug’s Life” like a fiber of marketing-speak DNA. “Think different” isn’t just an ad line: It’s been the persistent mantra of Jobs’ extraordinary career of visionary breakthroughs, appalling hubris and unpredictable comebacks.
“Making a difference” is the ideal that has led Jobs to his greatest achievements and most profound failures. He goaded the original Macintosh team by promising them that they would “put a dent in the universe.” He lured Pepsi CEO John Sculley to Apple in 1983 by asking him, “Do you want to spend the rest of your life selling sugared water or do you want a chance to change the world?”
Difference — innovation that simply wasn’t available from other computer companies — transformed Apple into the powerful icon it remains today, both in its own industry and in the broader world of American business. Difference — incompatibility with existing systems in which people and companies had invested too much time, energy and resources to abandon — also trapped Apple in a small niche even as the computer marketplace exploded around it.
To understand Jobs’ thirst for difference, you have to look back to Apple’s creation myth — the Two Guys in a Garage tale that continues to fuel Silicon Valley dreams. The pair’s roles are clearly defined in the saga: Steve Wozniak was the Hardware Guy, whose elegant circuit-board designs for the Apple I and II still cause engineers’ eyes to mist over; Jobs was the Marketing Guy, who understood that Woz had a product that could sell — especially if you put it in a then-sleek plastic-mold box.
Without Jobs’ marketer’s eye, Apple would never have made a noise, let alone a difference. But in Silicon Valley, even more in the mid-’70s era that spawned Apple than today, marketing was something at which engineers turned up their noses. Engineers design chips and write software; they make things that people use, things that matter, things whose value can be objectively determined: Does it work? Is it fast? Is it cheap? Marketers, on the other hand, deal in people’s subjective perceptions, the intangibles of taste and image and brand — slippery things that an engineering culture distrusts. On the Silicon Valley totem pole, great engineers are the people who make a difference; great marketers just make money.
For Steven Paul Jobs — born Feb. 24, 1955, a ’60s-style seeker who grew up in the heart of the Valley, dropped out of Reed College, designed video games for Atari and then traveled around India before founding Apple with Wozniak — being dismissed as a Marketing Guy must always have stung. And in many ways the rest of his career can be seen as an effort to shed that label.
Jobs has always aggressively and ostentatiously shunned the corporate suit. A vegetarian who clinched a 1997 entente with Microsoft on a barefoot walk through Palo Alto, he still prefers a uniform of black turtleneck and blue jeans. His most famous marketing coup — the ad that introduced the Mac, in which a revolutionary individualist wields a sledgehammer to liberate a world of corporate drones — said “up yours” to corporate America. And it did so with such vehemence that Apple’s directors pulled it after a single showing during the 1984 Super Bowl.
From Apple’s early days in his family’s garage, Jobs set out to prove his mettle as a techno-visionary whose concerns and talents rose high above the messy muck of the marketplace. If Silicon Valley is full of passionate code slingers and chip creators imbued with the notion that their work is a higher calling, Jobs is responsible for giving them a high-profile role model. As Robert X. Cringely puts it in his “Accidental Empires,” “[Bill] Gates sees the personal computer as a tool for transferring every stray dollar, deutsche mark and kopeck in the world into his pocket,” whereas “Steve Jobs sees the personal computer as his tool for changing the world.”
Jobs is, in fact, so closely identified with Silicon Valley’s revolutionary techno-messianism that you can’t help wondering whether it’s a case of protesting too much. His famous challenge to Sculley, then, may not have been a simple invitation to join the cause; it also perhaps reflected Jobs’ own fear that he, too, might go down in history, like Sculley, as “just” a Marketing Guy — the computer-industry equivalent of a sugar-water peddler.
There was only one way for Jobs to avoid such a fate: He had to create something “insanely great” himself — great enough that he’d be remembered as its father rather than Woz’s salesman. That creation was, of course, the Macintosh.
But the Macintosh didn’t begin as Jobs’ baby; the revolutionary computer was an adopted child, like Jobs himself. (As an adult he discovered that his sister is novelist Mona Simpson, who loosely based the central character of her most recent novel, “A Regular Guy,” on him.) The natural father of the Mac was an Apple employee named Jef Raskin, who dreamed of building an easy-to-use, cheap computer, an “information appliance” for the masses. Raskin’s ur-Mac used a low-powered processor and had no mouse. The project kept getting canceled by Apple honchos, who were focused on the Lisa — Apple’s first windows-icons-and-mouse computer, inspired by what a Jobs-led Apple team had seen on a 1979 visit to Xerox’s legendary PARC lab, where computer scientists were cooking up the interfaces of the future. But when the expensive Lisa failed to win over its intended business customers, the Mac borrowed a lot of its ideas and became Apple’s last hope and savior.
Jobs, who’d been removed from the Lisa project by Apple’s suits, had elbowed Raskin aside and taken over the Mac team. He egged them on with slogans like “The journey is the reward,” “Real artists ship” and “It’s better to be a pirate than join the Navy.” He hung a pirate flag over the Mac developers’ offices. (The tale is told in full in Steven Levy’s superb history of the Mac, “Insanely Great.”)
Building a computer, as Jobs himself has repeatedly declared, is a team effort. Jobs didn’t design any part of the Mac. Its hardware was largely the creation of Burrell Smith; Andy Hertzfeld wrote much of the key software; other Mac team developers made central contributions. To what extent, then, can and should Jobs take credit for “the computer that changed everything”? If he is not an engineer and not a marketer, what is Steve Jobs?
Is he an “artist,” as he has sometimes called himself, with more than a little pretension? More apt, perhaps, is the role of “impresario” that his biographer, Jeffrey Young, identified in a 1989 New York Times profile. (“He goes out and finds new technologies like they were rock ‘n’ roll bands.”) But the most useful way to understand what Jobs does best is to think of him as a personal-computer auteur.
In the language of film criticism, an auteur is the person — usually a director — who wields the authority and imagination to place a personal stamp on the collective product that we call a movie. The computer industry used to be full of auteurs — entrepreneurs who put their names on a whole generation of mostly forgotten machines like the Morrow, the Osborne, the Kaypro. But today’s PCs are largely a colorless, look-alike bunch; it’s no coincidence that their ancestors were known as “clones” — knockoffs of IBM’s original PC. In such a market, Steve Jobs may well be the last of the personal-computer auteurs. He’s the only person left in the industry with the clout, the chutzpah and the recklessness to build a computer that has unique personality and quirks.
Indeed, Jobs’ products have always been a reflection of his personal taste, resulting in both stunning breakthroughs and marketplace disasters. The original Macintosh contained no fan because Jobs hated the sound — but that meant you couldn’t put a hard disk drive in its case. Its keyboard lacked cursor keys because you were supposed to be using the mouse, anyway. There were no expansion slots to add hardware — why would you want to mess with perfection? Some of Jobs’ choices were stubborn and misguided; others, like the Mac’s reliance on 3.5-inch floppies in hard plastic cases, were so smart they created new standards. Alone among the computer industry’s entrepreneurial godfathers, Jobs always paid attention to his products’ appearance: Whatever its failings, a Jobs computer has always had a distinct look that shouts, “Someone fussed over me.”
Aesthetic design goes hand in hand with marketing, of course, and Jobs’ obsession with the surface of his products has probably bolstered his image as a Marketing Guy. But Jobs has never taken the scientific MBA-style approach of market research and demographic targeting; he follows his nose like a true connoisseur. In “Insanely Great,” the designer of the Mac II says: “Steve Jobs thought that he was right and didn’t care what the market wanted — it’s like he thought everyone wanted to buy a size 9 shoe.” In his defense, Jobs has told interviewers over the years that, in a fast-evolving industry like computers, you can’t just “give people what they want” because people don’t necessarily know what they want — and what they tell you they want today may not be what they actually want at the end of the two years it takes you to build it to their specifications.
If Jobs is able to imprint his personality on computers, it’s little surprise that those computers should, like any human personality, contain contradictions. Every one of Jobs’ projects has created its share of paradoxes. The Mac was supposed to be the “computer for the rest of us” — yet it wound up in a narrow niche. Strangely, the machine that held the promise of making computing less arcane and more approachable for the masses instead acquired the image of an elitist gadget — a computer for people who think they’re better than everyone else. You can blame this partly on the Mac’s consistently higher prices, relative to clumsy but functional DOS- and later Windows-based PCs — but even more on the ideology with which Jobs imbued the Mac from Day One. Most people don’t want to be pirates; for better or worse, they want computers to process words and crunch numbers, not to change the world. Apple’s “difference” became an albatross in a world where buyers demanded compatibility. And the fight-the-power rhetoric that fired up the Mac loyalists made Apple a tough sell to the Fortune 500.
At Next, the company Jobs founded after Sculley ousted him from Apple in 1985, Jobs set out to build the ultimate workstation for scientists and scholars. The Next’s cubic black-magnesium box looked cool; its optical drive was a bold bet on the future, as was its modular object-oriented software. But by the time it came to market in 1989, its $10,000 price tag meant it had to be repositioned as a business computer — and by the late ’80s businesses weren’t flocking to Jobs’ visionary schemes. Next floundered until Jobs jettisoned its hardware business and refashioned it as a software company.
The contradictions continue right up to the present. It was Next’s software that provided Jobs with the key back to his Apple kingdom at the end of 1996, when Apple’s then-CEO, Gil Amelio, was desperate for a new operating-system strategy. Jobs sold Apple the entire company for $400 million. “Apple is buying Next, not me,” he declared, insisting that his primary allegiance remained with the fabulously successful digital animators at Pixar, which he’d bought in 1986 and carefully nurtured to its “Toy Story” triumph.
But six months later, Amelio was out and Jobs was the new “interim” CEO — the title he holds to this day. Meanwhile, as key Jobs lieutenants from Next filtered into Apple’s top ranks, the original operating-system strategy built on Next’s software has been radically revised to provide more continuity with the existing base of Mac software. Instead of Apple buying an operating system from Jobs, Jobs had engineered what stunned observers dubbed a “reverse takeover” of his old company — and walked off in the deal with 1.5 million shares of Apple stock. (He sold them off at a critical moment during Amelio’s last days; having made over $1 billion in the initial public stock offering for Pixar, he may have viewed this as chump change.)
Whatever the specifics of the boardroom maneuvering that returned Jobs to Apple’s helm, it’s easy to believe Amelio’s claim, in his whiny memoir, “On the Firing Line,” that Jobs seduced and betrayed him. It wouldn’t be the first time in Jobs’ career that he’d been accused of such behavior; Silicon Valley is littered with embittered former collaborators who warn of his “reality distortion field.” Working for Jobs is, they say, a cultlike thing: Buying into his vision is “drinking the Kool-Aid,” and “you only drink the Kool-Aid once.”
In a 1985 Newsweek interview, Steve Wozniak accused Jobs of a deep duplicity: “Even in personal conversations with the guy, you could never really tell what he was thinking. You might ask him a yes-or-no question, and the answer said ‘no’ to anyone who heard it but really meant ‘maybe yes, maybe no.’ … Aside from not being able to trust him, he will use anyone to his own benefit.” Mac software pioneer Andy Hertzfeld similarly told Fortune in 1988: “He’ll do anything to get his way. When he’s trying to convince you to do something, he’ll take one tack, then switch.”
Such comments, of course, are often strewn in the wake of successful empire-builders. What’s different about Jobs is that he has always seemed less interested in building institutions than in overturning established orders. He is a bundle of paradoxes: A manipulative cult-of-personality leader, he also brings egalitarian principles to his workplaces (Next’s open-information policy extended to everyone’s salaries); he longs to unleash the chaos of new technologies — as long as he can dictate the shape of their boxes and the color of their power cords. He is, it seems, a revolutionary control freak.
Given this heritage, what’s most remarkable about his latest comeback at Apple — where he has restored the company’s profitability, renewed excitement among its following and begun to make sense of its product line and strategy — is that he is winning kudos as, of all things, an effective manager. Where once his “Management By Walking Around” terrorized underlings with capricious put-downs, the word now is that Jobs — middle-aged, married and a proud paterfamilias — has learned how to inspire people without insulting them.
Only the current crew at Apple can know whether that is true or just another instance of Jobs’ prodigious ability to spin the press — which has always been an important element of his success. Soon after returning to Apple’s helm in the summer of 1997, for instance, he granted Time an exclusive-access interview; in return, he got a slobbering cover story that portrayed the valuable but humiliating deal he cut with Microsoft as a personal triumph.
Whether Jobs has mellowed or not, what’s evident even from the outside is that his taste in technology — the trait on which he has built his career — has grown more catholic. He seems willing to meet the market halfway today. Could he have finally outgrown his need to prove to the world that he’s not a Marketing Guy? Sure, the iMac has its quirks — the missing floppy drive, the move to the new USB standard for peripherals — but it’s mostly just a state-of-the-art Mac in a pretty new box at a competitive price. And as the sales figures show, that’s just what the public wanted.
Similarly, in the biggest challenge Jobs and Apple still face — the long-promised and long-bungled renovation of the estimable but antiquated Mac operating system — Jobs seems to be moving Apple toward evolution rather than revolution. In ditching Amelio’s clean-break Rhapsody plan in favor of the new OS X/”Carbon” strategy, Jobs has promised the Mac community that the new operating system will not leave old Mac software behind as it brings the Mac onto more modern, less bug-prone ground.
This may be the first time in Jobs’ long career that he has opted for the pragmatic appeal of backwards compatibility rather than the allure of the coolest novelty. The plan doesn’t sound at all like the old Steve Jobs — who’d have cried, “Give people the greatest operating system in the world and they’ll beat a path to your door!” In fact, it sounds a lot like the anti-Jobs himself, Bill Gates — who has brilliantly carried generations of users and software across one technological transition after another, testing their patience with bugs but never losing them to the competition.
In a Washington Post interview in 1992, Jobs said, “If you look at my life, I’ve never gotten it right the first time. It always takes me twice.” At the time, he was talking about Next — which, as it turned out, he never got right. But in his second time around at Apple, it certainly looks like Jobs has learned from his mistakes — preserving his creativity while tempering his cruelty, bringing a gift of imagination to an industry that sorely needs it.
Mellower at 43 and less insecure, Jobs has, perhaps, realized that there are other ways to “make a difference” than those he embraced as a young rebel. After all, “think different” doesn’t have to mean thinking differently from other people. It can also mean changing the way you think yourself.