Moore's Law is a celebrated principle laid down by Intel co-founder Gordon Moore, which decrees that the number of transistors you can pack onto a chip will double every 18 months. It's thanks to Moore's Law that the computer you buy today costs roughly the same as the one you bought a few years ago -- but runs two or three times as fast. (You can also blame Moore's Law for making you think that you must have that newer, faster computer -- but that's another story.)
If you've been following the headlines recently, you know that something funny is happening to Moore's Law. It's "no longer true." Or it's "about to hit a wall." The process of chip development is speeding up -- but wait! It's about to slow down!
The latest round of kookiness about Moore's Law began on Sept. 17 with a front-page New York Times story by John Markoff. Headlined, goofily, "New Chip May Make Today's Computer Passé," Markoff's piece reported an innovative new design technique at Intel that basically allows engineers to pack twice as much information onto the same chip. The story made clear that the technique is currently limited to a specialized form of chip known as flash memory. When most people think of Moore's Law and its implications, they think about the microprocessor CPUs (Pentium, PowerPC and so on) that sit at the heart of today's computers. But Markoff's story nonetheless suggested that Intel's breakthrough "throws Moore's Law out the window."
Five days later, on the front page of the Times' business section, a story by Laurence Zuckerman reported another breakthrough, this one at IBM -- where scientists had figured out how to substitute copper for aluminum in microscopic chip circuitry, allowing an even greater degree of miniaturization than previously thought possible. Moore's Law, Zuckerman wrote, "must now be rewritten." In fact, the IBM breakthrough was "perhaps more significant" than the Intel announcement because it could be applied to all kinds of chips, and not just Intel's flash memory.
Throw it out the window? Rewrite it? Poor Mr. Moore! Technology's advancing even faster than he predicted. But wait a second -- didn't we read a year earlier, on the Times' very own Web site, about the "demise" of Moore's Law? According to that piece, by Ashley Dunn, Intel's engineers were actually falling way behind Moore's schedule of chip improvement rates: "With the introduction around the millennium of Intel's next generation P7 microprocessor, with about 10 million transistors, the company will have fallen hopelessly behind even the 18-month cycle, which predicts nearly 170 million components in 2000." If Dunn's math is right, then all the hoopla in Markoff's and Zuckerman's pieces about the chip business accelerating way past Moore's predictions is bunk, and these new innovations are actually proving Moore right -- they're allowing the chip manufacturers to keep up with his original forecast.
There's yet further confusion: It turns out that there's disagreement over what Moore said in the first place. Most statements of Moore's Law peg it at the double-every-18-months rate. Yet Dunn's article reports that Moore's original 1965 prediction was an annual doubling, and chides Intel for "Stalinesque" revisions of the number to a current double-every-two-years prediction. Over at USA Today, Kevin Maney actually talked to Moore, who said he'd originally charted an annual doubling in 1965 and then revised it in 1975 to a double-every-two-years pace. So God only knows where the 18-month figure came in; maybe somebody just split the difference between the 1965 and 1975 rates, and it stuck.
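The choice of doubling period matters enormously, because the differences compound. Here's a quick back-of-the-envelope sketch of how far apart the three rates land by 2000 -- starting, purely for illustration, from the Intel 4004's roughly 2,300 transistors in 1971 (the baseline is my choice, not one Moore or Dunn specified):

```python
# Project transistor counts under different doubling periods.
# Baseline: Intel's 4004 microprocessor, ~2,300 transistors in 1971
# (an illustrative starting point, not one Moore himself used).

def projected_transistors(base_count, base_year, target_year, doubling_years):
    """Transistor count predicted for target_year if the count
    doubles every doubling_years, starting from base_count."""
    doublings = (target_year - base_year) / doubling_years
    return base_count * 2 ** doublings

BASE_COUNT, BASE_YEAR = 2300, 1971

for label, period in [("12 months", 1.0), ("18 months", 1.5), ("24 months", 2.0)]:
    count = projected_transistors(BASE_COUNT, BASE_YEAR, 2000, period)
    print(f"doubling every {label}: ~{count:,.0f} transistors by 2000")
```

From that 1971 baseline, annual doubling predicts over a trillion transistors by 2000, the 18-month pace about 1.5 billion, and the two-year pace in the tens of millions -- the only one anywhere near the chips that actually shipped. (Dunn's 170 million figure evidently rests on a different baseline; the fact that the answer swings by orders of magnitude depending on where you start is itself part of the confusion.)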
In the end, the real story about Moore's Law remains what it's been for several years now: Sometime in the next 10 to 20 years the pace of chip development will slow, as engineers get down so close to the level of individual atoms that they can't squeeze in any more circuits. CNet interviewed Moore earlier this week, and he confirmed the prognosis: "Some time in the next several years we get to some finite limits, but not before we get through five generations" -- five more rounds of microchip development, by which time the Pentium will have become something like the Decium.
As Moore joked, "That's well beyond my shift." By then the technology writers will have to find someone else's law to kick around.