Are microchips too fast for mere mortals?

Moore's Law means our processors get faster every year -- but no law can find uses for all that computing power.


Scott Rosenberg
June 18, 1998 11:00PM (UTC)

Faster! Cheaper! Smaller! The war cries of the personal-computer revolution are now the stuff of popular myth. Even digital neophytes understand that next year's machines will beat the pants off this year's -- and that, just as surely, computer prices will drop.

The computer industry's ability to deliver on those promises of speed, economy and miniaturization depends on a principle known as Moore's Law. In 1965, Intel Corporation co-founder Gordon Moore predicted that the number of transistors that could be crammed onto a microchip would double at a regular interval. Moore has adjusted the exact interval a couple of times: originally it was a year; since the mid-1970s it has been roughly 18 to 24 months.
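The compounding behind that doubling interval is easy to underestimate. A minimal sketch of the arithmetic, assuming a clean 18-month doubling period (the real interval, as noted above, has varied):

```python
# Rough illustration of Moore's Law compounding, assuming a clean
# 18-month doubling period (the actual interval has varied).

def transistor_multiple(years: float, doubling_months: float = 18.0) -> float:
    """How many times the transistor count grows over `years`."""
    return 2 ** (years * 12.0 / doubling_months)

# A decade of 18-month doublings compounds to roughly a hundredfold increase.
for years in (3, 6, 10):
    print(f"{years:2d} years -> ~{transistor_multiple(years):,.0f}x")
```

Steady doubling, in other words, quickly dwarfs any linear improvement: the same rule that yields a 4x gain in three years yields roughly 100x in ten.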


Reality -- prodded chiefly by the manufacturing whizzes at Intel -- has so far lived up to Moore's calculations. Prodigious improvements in chip technology are measurable by engineering standards like clock speed, millions of instructions per second and transistors per square inch. Today's computer marketing focuses almost exclusively on processor speed, as in the recent advertising war between Intel and Apple: Intel's commercials feature "Bunny People," clad in the space-suit-like garb that's de rigueur in super-immaculate chip-fabrication plants, dancing in praise of the Pentium II -- while Apple's spots mock the competition as snails and apologize for "toasting" them.

But when the noise dies down, a nagging doubt persists: Sure, our chips work faster. But do we? Or is Moore's Law simply too fast for mere human beings to keep up with?

This isn't an abstract or rhetorical question. For one thing, if all that costly computing power we've invested in hasn't actually boosted our productivity, then the global economy is in big trouble. But while the economists engage in that learned debate, the question has a more personal dimension, too. If chip speeds have outstripped the uses we can put them to, then maybe we don't need to heed the call of the Bunny People. Maybe that new, top-of-the-line machine isn't a must. Maybe the snail's life is good enough for now.


Aaron Weiss, a writer in Halifax, Nova Scotia, recently bought a new computer built around a 300 MHz Pentium II processor. While he's a self-described "geek" who delights in the system's raw speed, he also says it doesn't accomplish much more for him than his previous 150 MHz Pentium system: "When my days consist of opening and closing Microsoft Word and opening and closing Netscape and surfing around, they're not much different. If some operation used to take me one second, and now I've got a computer that's twice as fast and it takes half a second, how much can I perceive that difference?" (Weiss' essay on the myth of obsolescence accompanies this article.)

Today's computer users are often haunted by the gap between what their computer's spec sheets say about its vast capabilities (you've got a zillion times the power of a 1960s mainframe sitting on your desk!) and what they can actually accomplish with it. While our processors crunch twice as much information every 18 months or so, our brains haven't simultaneously provided them with twice as much to do.

Two factors bear primary responsibility for this lag. One is what Dataquest microprocessor analyst Nathan Brookwood calls "application stagnation": the advanced maturity of the marketplace for everyday software like word processors and spreadsheets. The other is the nature of the computer's most popular new application, the Internet.


A recent column by John Dvorak in PC Magazine surveyed 10 different software categories and rated the prospects for innovative leaps in them as almost uniformly dismal. Software manufacturers led by Microsoft keep trying to add new features to their products to eat up Moore's Law-generated processor power; these "bells and whistles" turn programs into ever more baroque edifices -- but they rarely boost personal productivity or enable us to accomplish something genuinely new. Microsoft wants you to buy a new version of Office every couple of years, and figures you might want to buy a new computer to run it on, too. But the law of diminishing returns is at work here, and there are only so many Page Wizards and barely trustworthy grammar-checkers a person needs.

Such "software bloat" or "feature-itis" is often the culprit behind our feeling that, though our chips may be cycling faster, we users are still running in place. In fact, one obscure but valuable corollary to Moore's Law, attributed to the Swiss professor Niklaus Wirth, holds that software gets slower at a faster rate than hardware gets faster.


Still, computers are shape-shifting devices -- software not only affects their efficiency but can completely transform what we use them to do. The software industry is built on the concept of the "killer app" -- the application so cool and useful that people will buy the whole computer just to run it. Spreadsheet software like VisiCalc and Lotus 1-2-3 sold people on the Apple II and the DOS-based PC; desktop publishing software made the Macintosh. There's a cyclical, jerky dance between software and hardware that moves the computer business forward in lurches. But recently, it's slowed to a crawl.

That's because the most recent "killer app," the Internet itself, is different from its predecessors. Beginning around 1995, getting on the Net became the key motivation for first-time computer buyers. And getting on the Net doesn't demand the fastest processor in the latest box. Unless you're using a PC that's many generations old, your processor isn't what's holding you back when you're online: It's idle much of the time, waiting for information to trickle through your modem.

"The question is no longer, how fast can I do my thing on my computer? It's, how fast can I get what I need to use?" says Aaron Goldberg, executive vice president of Computer Intelligence, a business research firm.


Even with the faster Net connections many people enjoy in offices -- or with newer high-speed home connections like ISDN, digital subscriber lines and cable modems -- the network often remains a bottleneck: Your computer's line to the Net may be speedy, but the Net can be clogged at any number of points between you and the server you're trying to reach. And even if the Net's having a good day, most Web browsers and Net-based applications will only use a fraction of your computing power; sooner or later, most Internet software leaves today's zippy Pentiums and Pentium IIs twiddling their thumbs.

It's this simple fact -- as much as lower-priced chips produced by Intel's competitors or overproduction in subsidiary markets for memory chips, hard drives and so on -- that has led to the remarkable drop in computer prices over the last year. Rather than pay a premium for the season's latest speed-demon chip, many computer buyers have decided instead to use Moore's Law as a wedge to push prices down.

The rise of the sub-$1,000 PC is a reminder that Moore's Law's payoff of "faster, cheaper, smaller" chips can be sliced in different ways. You can pay the same as last year for a chip that's a lot faster. But if you don't need or want that faster chip, you can also buy last year's chip for a lot less money. Of course, you need to find someone who'll sell last year's chip to you -- and Intel, which profits the most from the high-end chips, has always been loath to do so. But competitors like AMD and Cyrix jumped in when they sensed the demand, forcing Intel to try to catch up in this market.


"The reason we never had $700 PCs before was not because you couldn't build them, but because they wouldn't do anything you wanted them to do," Goldberg says. "Now the question is, gee, a Pentium MMX 200, with 16 megabytes of memory and 3 gigabytes of hard drive for $699, what doesn't it do? Having more than one of those in my home, I can say, the answer is: nothing."

Responding to this market-wide sea-change, the computer industry has adopted a potpourri of strategies:

1.) Sell to the power junkies. There are, of course, computer users who actually need the fastest processors they can find. Avid gamers can always count on the gaming industry to churn out processor-hungry software; it takes a tough chip to generate realistic-looking three-dimensional monsters on the fly. Graphic designers, multimedia developers and scientists can all make good use of a speedy processor, too. And of course the servers that sit at network hubs and dole out Web pages and database info to scads of simultaneous users can always take advantage of additional speed. The computer manufacturers will continue to profit by meeting these needs -- but they're terrified that their customer pool will shrink to only such users. In this business, "niche" is a dirty word.

2.) Find new jobs for computers. Videoconferencing isn't an everyday use for your computer today; Intel hopes to change that. The chip manufacturer has been trying to jump-start this market because it's one of the few new applications that might actually put Moore's Law back to work and give people a reason to buy a Pentium II. First, though, it will have to work well and be on enough people's systems to have some practical value. Intel has also sunk large sums of money in advertising campaigns channeled through Web sites that are "optimized" for the Pentium II chip -- sites that demand a beefy processor to keep up with features like video streaming and 3-D virtual-reality spaces. So far, though, most of these technologies remain crude and unreliable and haven't caught the popular imagination. It's hard to sell people on videoconferencing when they're still getting used to e-mail.


3.) Speed up the Net. If the Internet itself is the bottleneck preventing people from using their processor power, goes this line of industry thinking, then let's speed it up. This is why you'll now find computer companies investing across the board in all kinds of bandwidth-boosting technologies, even ones that directly compete with each other -- like the cable companies' cable-modem service and the telephone companies' Digital Subscriber Line approach. (This week, Microsoft and Compaq each bought 10 percent stakes in Road Runner, the Time Warner-founded cable-modem venture.) Still, while the speed of Internet connections is certainly improving, it doesn't move at anything like the pace of Moore's Law's steady doubling. Physical constraints and economics apply more of a brake in the everyday world of wires than in the pristine realm of the chip factory. "It's one thing to build wafers, another thing to plant cable," Goldberg says. "Moore's Law does not apply to backhoes."

There's another strategy for weathering this period of stagnation that few companies will embrace but that has some history behind it: Simply wait. A decade ago the computer industry found itself similarly becalmed; Intel's powerful and expensive new 386 processors weren't catching on, mostly because DOS -- the operating system most users and most software depended on -- couldn't do much more with them than with older, cheaper chips. Intel launched a famous ad campaign that attacked its own older product line, featuring its popular 286 chip in a circle with a big red diagonal slash through it. But users waited to upgrade and embrace the 386 until Microsoft gave them a reason to need it: the new graphic operating system Windows, in its semi-stable and reasonably usable 3.0 version, which not only used the 386's horsepower but made it feel slow. (Similarly, Windows 95 gave people reason to trade in their 386- and 486-based machines for Pentiums.)

Dataquest analyst Nathan Brookwood argues that it will take a similarly major transition in computing to motivate users to buy the next generation of processors. "There's 40 years of history that says the software guys always do catch up," Brookwood says. "The opportunities for them to enrich themselves by finding ways to use the computing power that's now available are so phenomenal that they're highly motivated to do that -- just as the guys who are making the chips smaller and smaller, faster and faster are highly motivated to do that."

That, in other words, is the inspiration behind Bill Gates' magical vision of the future of computing -- in which voice recognition works really well, and smart software learns from our every move. The interfaces of the future may well give us all reason anew to trade up to Intel's latest processors running Microsoft's most updated software.


But there's another possibility: Faster, cheaper, smaller. Moore's Law is, after all, about miniaturization. Processors get faster and cheaper because more transistors are squeezed into the same sliver of silicon. Maybe Moore's Law -- having produced chips that quickly do most of what we want them to do today, and then having driven down their prices -- will next drive the market toward tinier devices. In other words, instead of taking the "twice as many transistors in the same space!" route, we'll opt for "the same number of transistors in half the space!"

This is precisely what a new report from IDC, the industry research firm, suggests: In "Death of the PC-centric Era," IDC predicts that a new wave of Internet-access devices -- "TV set-top boxes, Web-enabled telephones, Web-enabled personal digital assistants and Web-enabled videogame consoles" -- will take the lion's share of the new Internet market away from the PC. If all we need to use the Net is the power of yesterday's PC -- and yesterday's PC now fits in your hand -- then how many of us are going to need tomorrow's PC?

The author of the IDC report, Frank Gens, argues that this change is already under way, and it's as big as the transition to the personal computer was two decades ago: "The appliance model may do to the PC guys what the PC market did to the mainframe and mini [computer] guys."

Ironically, then, just as Moore's Law made today's thriving PC industry possible, it could also bring the business down. Unless the PC industry finds a way to kick-start us out of "application stagnation," large numbers of users -- particularly those who chiefly use their PCs for e-mail and the Web -- may simply abandon those big boxes on their desks for something a little more handy.


Whatever happens, Moore's Law remains the great engine of change in digital technology: Its exponential curve forces everyone in this field to stay light-footed on constantly shifting ground. But Moore's Law may not be forever: In a speech last fall, Gordon Moore said that his principle is on track to continue its inexorable advance for another 20 years -- at which point it's likely to "run out of gas," as ultra-thin transistor layers hit the limits of atomic-particle size.

This prospect fills some commentators with dread and causes others to lament the "end of the digital frontier." But since the speed of chip development keeps so much of the computer industry on an inhuman treadmill, a slowdown might have some hidden benefits. One of the side-effects of Moore's Law is to crank up the pace of software development so fast that companies won't, or can't, test new products adequately. If there's a stall in the development of hardware, maybe software will have a chance to catch up.

Maybe we'll get around to fixing all the things about computers that still don't work right. Maybe programmers will rediscover the art of writing code that's fast and lean and doesn't crash. And maybe computer manufacturers will start to differentiate their products in more useful ways, if they can no longer count on our knee-jerk willingness to buy a new box every three years.

David Coursey, editor of Coursey.com, a computer-industry newsletter, looks forward to this prospect: "I've been waiting for a pause in the industry, so that all the bug fixes could actually take place -- and for one golden moment, Microsoft Windows would work right."

But he also says he's not holding his breath.


Scott Rosenberg

Salon co-founder Scott Rosenberg is director of MediaBugs.org. He is the author of "Say Everything" and "Dreaming in Code" and blogs at Wordyard.com.
