In one of the most legendary scenes from the annals of Silicon Valley, Apple co-founder Steve Jobs lured Pepsi president John Sculley to join his young company with the challenge: "Do you want to spend the rest of your life selling sugar water, or do you want a chance to change the world?"
A computer company change the world? In 1983, maybe, you could say that with a straight face. But too many bad quarters since then have eroded Apple's revolutionary pretensions, and no one -- not even Jobs -- could get away with expressing them so rapturously today.
If such aspirations still beat anywhere within the battered heart of the cult of Macintosh, they're definitely out of vogue elsewhere in the merciless high-tech marketplace of the '90s. Real businesses don't try to change the world, Apple's successful competitors sneer -- at Microsoft and Intel and Oracle, the idea has always been to put change in shareholders' pockets.
If a company makes (or loses) billions of dollars, eventually someone will write a book about it, whether the world was changed or not. We're currently awash in a raft of technology-business chronicles -- tomes that profess to take you inside boardrooms to witness the smart deals and dumb blunders that may not have changed the world but definitely made (or lost) vast fortunes. Microsoft books already fill a small shelf; in this season's new batch, reporters take on Intel, Oracle and Apple.
But boardrooms are mostly dull places -- the best business writing goes beyond executive infighting to open windows onto a company's psyche. As you take in these chronicles of bonanzas and blunders, turnarounds and takeovers, you start noticing a pattern: Apple isn't the only cult in this industry. Each of these companies has its own version of the True Faith, with a powerful leader shaping its beliefs and behavior. We've already read about how young executives at Microsoft embrace their boss's lingo (using "random" as a put-down, for instance) and adopt his habit of rocking back and forth in chairs. But at the center of every successful technology company, it seems, there's some Bill Gates-like founder-figure weaving a spell, bending others to his will.
Not every cult is idealistic. The corporation portrayed by Tim Jackson's engrossing "Inside Intel" is the kind of cult that runs its members' lives through small, controlling rituals -- like a notorious "Late List" that forces all workers reporting to work after 8 a.m. to sign their names. Timeliness is the cardinal virtue at Intel, and tidiness is rigorously enforced inside its cubicles. (In fact, Intel was the first company to put all its executives and employees in cubicles, according to Jackson.)
It would be convenient to attribute this kind of micromanagement to the nature of Intel's business, the design and manufacture of microchips. But as Jackson describes the company, the real source of the control-freak stance is chairman Andy Grove -- the Hungarian-born engineer who was Intel's hands-on boss through most of its existence. Here's Grove, a scientific manager known for delivering fierce browbeatings to lowly underlings and top execs alike, in mid-tirade:
Andy Grove was sitting there, holding a stave of wood the size of a baseball bat. At the end of the stave was a hand shape, encased in a protective glove of the kind used inside Intel's fabs [chip factories], with the middle finger extended in an obscene gesture. Grove had just slammed the wood onto the surface of the meeting-room table -- and was now shouting at the top of his lungs: "I DON'T EVER, EVER WANT TO BE IN A MEETING WITH THIS GROUP THAT DOESN'T START AND END WHEN IT'S SCHEDULED."
Even as the book chronicles Grove's stringent management techniques -- like a "Scrooge memo" reminding workers that they were expected to work a full day on Christmas Eve -- Jackson gives Grove credit for much of Intel's phenomenal success in conquering the microprocessor field and adapting to constant change. Intel was not the kind of cult that clung to dogmas that had outlived their usefulness. Though it invented the DRAM (memory chip) business in the 1970s, it gave up on it in the '80s once Asian competitors drove down profit margins in that market. And though it preferred dealing directly with other companies and their engineers rather than the public, the highly publicized Pentium long-division bug incident of 1994 forced it to accept, reluctantly, that individual PC owners were its customers, too. A company that once proudly eschewed mass marketing and resolutely gave its products numbers rather than names woke up in the '90s and decided to turn itself into a brand name, making its "Intel Inside" slogan as ubiquitous as McDonald's golden arches.
Whatever his faults, Grove never seems to have confused precision with rigidity. In Jackson's portrait, he emerges as too smart and too observant to be the classic Angry Boss; his management-by-fear comes off as more a kind of calculated sadism. Earlier this year, Grove published a management primer titled "Only the Paranoid Survive." His paranoia evidently extended to Jackson, who never got an in-depth interview with Grove himself -- and found that the company ordered employees, partners and customers not to talk to him.
It's thus no small feat that "Inside Intel" is as good, and evenhanded, as it is. When engineers need to reproduce a competitor's chip, they "reverse-engineer" it, taking apart the finished product to work backward toward its design. Jackson in effect had to reverse-engineer his history of Intel, working largely from legal documents and interviews with former employees to assemble a complete picture.
It's in these legal dossiers that Intel looks most thoroughly and appallingly like a classic paranoid cult, as it sends in-house lawyers and even security-patrol goons after whistle-blowers or employees suspected of stealing secrets. As with cults everywhere, severe penalties are reserved for traitors who leave the fold.
The lesson of "Inside Intel" is that, for all the company's innovations and its "standard virtues of hard work, good technical support, good record keeping, promptness and market knowledge," its success has been more a matter of marketing prowess than technical excellence. The chip that made Intel what it is today, the 8088 processor, achieved market dominance by accident: Designed as a stopgap to fill a hole in the product line, it wound up as the heart of IBM's first PC in 1981 by sheer happenstance. Despite its limitations, its architecture became the foundation for Intel's most advanced chips today -- while the company's more ambitious efforts to push the limits of processor technology at that time (like the 8800 chip) fell by the wayside.
There were few such accidents on Intel's marketing front. The company was always on guard. Upstart rivals would be hounded by harassing lawsuits, and serious challenges in the market were met by massive company-wide mobilizations -- like Operation Crush, launched in 1978 against a competing Motorola chip that Intel's leaders knew was superior to their own product. "We have to kill Motorola, that's the name of the game. We have to crush the fucking bastards," an exec thundered at the Intel troops. In an exercise straight out of David Mamet's "Glengarry Glen Ross," salespeople competed to rack up "design wins," accounts where they'd convinced clients to build products around Intel's chip instead of Motorola's. The guy with the most design wins got a trip to Tahiti.
Crush succeeded; the Tahiti prize outweighed the Motorola chip's edge. As so often happens in the technology industry, maniacally aggressive marketing trumped sheer technical superiority. A trade-show promoter quoted in Mike Wilson's "The Difference Between God and Larry Ellison" declares: "Average technology and good marketing beat good technology and average marketing every day."
That, certainly, is how Wilson presents the saga of Ellison and Oracle. Oracle started its business in the '70s by grabbing some ideas about the structuring of relational databases from IBM and hurrying them to market. While the lumbering Big Blue labored to build database software that worked reliably, Oracle rushed out a buggy product, seized a big chunk of the market and never let go. Later on it defeated competitors like Ingres, which often had superior software but lacked Oracle's go-for-the-kill sales instincts.
Today Oracle's databases efficiently maintain corporate America's "information warehouses," but in the company's early years, its products were a mess. An engineer describes the very first version of its software as "the roach motel of databases: the data went in; it didn't come out." Later versions were regularly released long before they were bug-proofed; cross-platform "ports" of programs would be promised long before they could be delivered. Oracle wasn't the only software company peddling "vaporware," but it perfected the art of actually selling products that didn't yet exist. Sometimes, Wilson reports, Oracle would respond to customers' frantic demands by deliberately shipping them blank or unreadable computer tapes to win a few more days' time.
These practices did not emerge in a vacuum: As Wilson tells it, they sprang straight from the personality of Ellison himself. The Oracle founder was himself "extravagantly and remorselessly late." (He'd have merited a long chewing out from Andy Grove.) Ellison "habitually said things that were provocative or demonstrably untrue," from fudging his college record and exaggerating the toughness of his childhood Chicago neighborhood to selling customers products and features that his engineers hadn't started working on.
As the CEO, so his sales force: Wilson piles up tales of questionable practices salespeople used to meet their constantly doubling quotas -- everything from secret side-deals with big accounts to inventing fictional companies as clients. As such deceptions multiplied to make good on Ellison's vision of infinite growth, Oracle headed for a disastrous fall in 1990. By this time, the firm -- a database company! -- had utterly lost track of its own numbers and was "flying blind," as one exec put it. The "fucking plodding finance guys," as Ellison called them, finally had to be brought in to reform Oracle's business practices and restore its profits.
One thing Oracle never lost through all the turmoil was its share of the market. The choice of database software was a once-a-decade decision, and once a company settled on Oracle, it had little choice but to wait for problems to sort themselves out. The cost of switching software was simply too high.
Wilson's portrait of Ellison in "The Difference Between God and Larry Ellison" is not nearly as scathing as the book's title implies (it's the first half of a joke that concludes, "God doesn't think he's Larry Ellison"). Wilson strives for fairness, offering a sympathetic account of Ellison's ordeal by sexual harassment suit beginning in 1993 (he eventually prevailed in court, proving that the woman who sued him had forged an incriminating e-mail). Ellison emerges as a man who wants to behave like a tough S.O.B. and be loved at the same time. Though Wilson's effort to portray him as "the Charles Foster Kane of the technological age" is a little overblown, the book offers a complex vision of Ellison, not the caricature of the high-tech playboy familiar from business-magazine covers.
Wilson asks Ellison's friends and admirers (Steve Jobs is one) to explain the man's penchant for stretching the truth:
They believed that Ellison lived in the future. When he exaggerated the number of his employees, he was not really lying; he was just getting ahead of himself. "He had a problem with tenses," [Oracle co-founder Edward] Oates said ... Jenny Overstreet, Ellison's longtime assistant, summed things up this way: "There's so much wishing he could make it so ... He doesn't live in today, because there are problems today and there are solutions tomorrow."
If Intel's is a cult of control, Oracle's, plainly, is a power-of-positive-thinking religion -- a revival-tent meeting in the grip of Ellison's optimistic chutzpah. Ellison is like the New Age gurus who peddle "visualization," only instead of picturing personal growth, he chose to visualize a multibillion-dollar software firm -- and delivered.
Ellison is ridiculously wealthy today, and as his fortune has ballooned, his ambitions have evolved. He dabbled in the business of interactive television during the 1994 "information superhighway" bubble. More recently, he has pursued his get-Bill-Gates obsession by propounding the gospel of the network computer. In both cases, he has seemed to embrace the notion that Oracle can do more than just make piles of dough. Ellison has caught Apple fever -- he, too, wants to change the world. (He has also pondered buying Apple itself on more than one occasion; today he sits on its board.)
Ellison's revolutionary spirit remains, at this point, so much vaporware. You can't help thinking his desire to change the world is more a display of ego than a passion for ideas. Of all the computer-biz whizzes and moguls, Gates included, only Steve Jobs can credibly claim any sort of revolutionary mantle: In its early days, Apple was the first company to demonstrate the reach and potential of personal computing, and with the Mac it was the first company to create a computer that was not only useful but artful.
Jim Carlton's "Apple: The Inside Story of Intrigue, Egomania and Business Blunders" chronicles how Apple's revolutionary mantle grew tattered through a decade of arrogance and mismanagement. Carlton is a dogged Wall Street Journal reporter who has assembled a trove of scoops, including a 1985 memo from Gates to Apple outlining a strategy to license the Mac that might have changed personal-computing history. Apple, of course, clung to high profit margins instead, and sank those profits into one extravagant research project after another, few of which would ever see the light of day. While Apple fiddled, Microsoft and Intel burned up the marketplace. Once again, good marketing beat out good technology.
The Apple saga, well known as it is, remains a true business tragedy. Carlton, alas, fails to give it the dramatic shape it demands: His book is repetitive, a morass of clichés, marred by occasional technical misinformation and hobbled by a narrow focus on Apple's constant executive re-shufflings. Carlton's "Apple" is like a 400-page news story: Though it competently compiles Apple's problems -- paralyzing decision-making by consensus, technically unskilled bosses who gave engineers too much rein, a smug belief that coolness conquers all -- it never finds its way inside the company's soul to discover what went wrong.
One clue can be found in the book's structure. "Apple" begins with Jobs' 1985 departure from Apple and ends with his recent return to the company's helm. When Jobs got forced out by Sculley, the sugar-water magnate he'd hired, Apple experienced the ultimate cult disaster: It lost its leader.
Whatever the merits of Sculley's power play -- and given Jobs' reputation for arrogance, there were doubtless many -- it left Apple adrift. None of the company's other leaders -- from Sculley to Michael Spindler to Gil Amelio -- could replace Jobs in the company mind, even as each had to grapple with the myths he'd created, myths that still governed Apple's behavior. Jobs' celebrated "reality distortion field" held sway over Apple's collective consciousness long after his departure. Sculley himself admits this in retrospect: "Apple always had the DNA of Steve Jobs, even after he was gone ... It was much more of a cult than a real company."
Jobs is back now, but his grand re-entrance is unlikely to rekindle Apple's change-the-world fervor. It may well be too late for him to do anything but put a brave face on disaster. Whether he pulls off some last-minute miracle or goes down with his followers in flames, though, he is achieving something none of his hapless successors at Apple ever managed. He's playing out his role in Apple's mythology, closing a circle, giving the bedraggled cult what it wants.
Whatever happens to Apple from now on, rebirth or annihilation, will have the satisfyingly symmetrical proportions of a religious fable. That's just what today's technology business wants, and what the public expects, from a computer-industry captain.