What happens when computers stop shrinking?

By around 2020, the age of the ever-smaller chip will come to an end -- and we'd better prepare for it

Published March 19, 2011 9:01PM (EDT)

I remember vividly sitting in Mark Weiser's office in Silicon Valley almost twenty years ago as he explained to me his vision of the future. Gesturing with his hands, he excitedly told me that a new revolution was about to happen, one that would change the world. Weiser was part of the computer elite, working at Xerox PARC (the Palo Alto Research Center, which pioneered the personal computer, the laser printer, and the windows-style graphical user interface), but he was a maverick, an iconoclast who shattered conventional wisdom, and also a member of a wild rock band.

Back then (it seems like a lifetime ago), personal computers were new, just beginning to penetrate people's lives as people slowly warmed up to the idea of buying large, bulky desktop machines in order to do spreadsheet analysis and a little bit of word processing. The Internet was still largely the isolated province of scientists like me, cranking out equations to fellow scientists in an arcane language.

There were raging debates about whether this box sitting on your desk would dehumanize civilization with its cold, unforgiving stare. Even political analyst William F. Buckley had to defend the word processor against intellectuals who railed against it and refused to ever touch a computer, calling it an instrument of the philistines.

It was in this era of controversy that Weiser coined the expression "ubiquitous computing." Seeing far past the personal computer, he predicted that the chips would one day become so cheap and plentiful that they would be scattered throughout the environment -- in our clothing, our furniture, the walls, even our bodies. And they would all be connected to the Internet, sharing data, making our lives more pleasant, monitoring all our wishes. Everywhere we moved, chips would be there to silently carry out our desires. The environment would be alive.

For its time, Weiser's dream was outlandish, even preposterous. Most personal computers were still expensive and not even connected to the Internet. The idea that billions of tiny chips would one day be as cheap as running water was considered lunacy.

And then I asked him why he felt so sure about this revolution. He calmly replied that computer power was growing exponentially, with no end in sight. Do the math, he implied. It was only a matter of time. (Sadly, Weiser did not live long enough to see his revolution come true, dying of cancer in 1999.)

The driving source behind Weiser's prophetic dreams is something called Moore's law, a rule of thumb that has driven the computer industry for nearly half a century, setting the pace for modern civilization like clockwork. Moore's law simply says that computer power doubles about every eighteen months. According to Moore's law, every Christmas your new computer games are almost twice as powerful (in terms of the number of transistors) as those from the previous year. Furthermore, as the years pass, this incremental gain becomes monumental. For example, when you receive a birthday card in the mail, it often has a chip that sings "Happy Birthday" to you. Remarkably, that chip has more computer power than all the Allied forces of 1945. Hitler, Churchill, or Roosevelt might have killed to get that chip. But what do we do with it? After the birthday, we throw the card and chip away. Today, your cell phone has more computer power than all of NASA back in 1969, when it placed two astronauts on the moon. Video games, which consume enormous amounts of computer power to simulate 3-D situations, use more computer power than the mainframe computers of the previous decade. The Sony PlayStation of today, which costs $300, has the power of a military supercomputer of 1997, which cost millions of dollars.
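
To see how quickly that doubling compounds, here is a rough back-of-the-envelope sketch in Python. The eighteen-month doubling time is the only input, and the figures it prints are order-of-magnitude illustrations, not measurements:

    # Moore's law as compound growth: computer power doubles about every 18 months.
    DOUBLING_TIME_YEARS = 1.5

    def growth_factor(years):
        """How many times more powerful chips become after the given number of years."""
        return 2 ** (years / DOUBLING_TIME_YEARS)

    for years in (10, 20, 40):
        print(f"After {years} years: about {growth_factor(years):,.0f} times more powerful")
    # Roughly 100-fold growth after a decade, and over 100-million-fold after four decades.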

So the old paradigm (a single chip inside a desktop computer or laptop connected to the Internet) is being replaced by a new paradigm (thousands of chips scattered inside every artifact, such as furniture, appliances, pictures, walls, cars, and clothes, all talking to one another and connected to the Internet).

When these chips are inserted into an appliance, it is miraculously transformed. When chips were inserted into typewriters, they became word processors. When inserted into telephones, they became cell phones. When inserted into cameras, they became digital cameras. Pinball machines became video games. Phonographs became iPods. Airplanes became deadly Predator drones. Each time, an industry was revolutionized and was reborn. Eventually, almost everything around us will become intelligent. Chips will be so cheap they will even cost less than the plastic wrapper and will replace the bar code. Companies that do not make their products intelligent may find themselves driven out of business by their competitors that do.

Of course, we will still be surrounded by computer monitors, but they will resemble wallpaper, picture frames, or family photographs, rather than computers. Imagine all the pictures and photographs that decorate our homes today; now imagine each one being animated, moving, and connected to the Internet. When we walk outside, we will see pictures move, since moving pictures will cost as little as static ones.

The destiny of computers -- like other mass technologies such as electricity, paper, and running water -- is to become invisible, that is, to disappear into the fabric of our lives, to be everywhere and nowhere, silently and seamlessly carrying out our wishes.

Today, when we enter a room, we automatically look for the light switch, since we assume that the walls are electrified. In the future, the first thing we will do on entering a room is to look for the Internet portal, because we will assume the room is intelligent. As novelist Max Frisch once said, "Technology [is] the knack of so arranging the world that we don't have to experience it."

We have to ask: How long can this computer revolution last? If Moore's law holds true for another fifty years, it is conceivable that computers will rapidly exceed the computational power of the human brain. By midcentury, a new dynamic will take hold. As George Harrison once said, "All things must pass." Even Moore's law must end, and with it the spectacular rise of computer power that has fueled economic growth for the past half-century.

Today, we take it for granted, and in fact believe it is our birthright, to have computer products of ever-increasing power and complexity. This is why we buy new computer products every year, knowing that they are almost twice as powerful as last year's models. But if Moore's law collapses -- and every generation of computer products has roughly the same power and speed as the previous one -- then why bother to buy new computers?

Since chips are placed in a wide variety of products, this could have disastrous effects on the entire economy. As entire industries grind to a halt, millions could lose their jobs, and the economy could be thrown into turmoil.

Years ago, when we physicists pointed out the inevitable collapse of Moore's law, the industry routinely pooh-poohed our claims, implying that we were crying wolf. The end of Moore's law had been predicted so many times, they said, that they simply did not believe it.

But not anymore.

Two years ago, I keynoted a major conference for Microsoft at its main headquarters near Seattle, Washington. Three thousand of the top engineers at Microsoft were in the audience, waiting to hear what I had to say about the future of computers and telecommunications. Staring out at the huge crowd, I could see the faces of the young, enthusiastic engineers who would be creating the programs that will run the computers sitting on our desks and laps. I was blunt about Moore's law, and said that the industry had to prepare for this collapse. A decade earlier, I might have been met with laughter or a few snickers. But this time I saw only people nodding their heads.

So the collapse of Moore's law is a matter of international importance, with trillions of dollars at stake. But precisely how it will end, and what will replace it, depends on the laws of physics. The answers to these physics questions will eventually rock the economic structure of capitalism.

To understand this situation, it is important to realize that the remarkable success of the computer revolution rests on several principles of physics. First, computers have dazzling speed because electrical signals travel at near the speed of light, the ultimate speed limit in the universe. In one second, a light beam can circle the world seven times or nearly reach the moon. Electrons are also loosely bound to their atoms and easily moved around (they can be scraped off just by combing your hair, walking across a carpet, or doing your laundry -- that's why we have static cling). The combination of loosely bound electrons and their enormous speed allows us to send electrical signals at a blinding pace, which has created the electric revolution of the past century.
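
The arithmetic behind those comparisons is simple enough to check. Here is a rough sketch; the distances are rounded approximations:

    # Rounded figures; both distances are approximations.
    SPEED_OF_LIGHT_KM_PER_S = 299_792   # speed of light in a vacuum
    EARTH_CIRCUMFERENCE_KM = 40_075     # around the equator
    MOON_DISTANCE_KM = 384_400          # average Earth-Moon distance

    laps_around_earth = SPEED_OF_LIGHT_KM_PER_S / EARTH_CIRCUMFERENCE_KM
    seconds_to_moon = MOON_DISTANCE_KM / SPEED_OF_LIGHT_KM_PER_S

    print(f"Laps around the Earth in one second: {laps_around_earth:.1f}")     # about 7.5
    print(f"Time for light to reach the moon: {seconds_to_moon:.2f} seconds")  # about 1.3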

Second, there is virtually no limit to the amount of information you can place on a laser beam. Light waves, because they vibrate much faster than sound waves, can carry vastly more information than sound. (For example, think of stretching a long piece of rope and then vibrating one end rapidly. The faster you wiggle one end, the more signals you can send along the rope. Hence, the amount of information you can cram onto a wave increases the faster you vibrate it, that is, by increasing its frequency.) Light is a wave that vibrates at roughly 10^14 cycles per second (that is 1 with 14 zeros after it). It takes many cycles to convey one bit of information (a 1 or a 0). This means that a fiber-optic cable can carry roughly 10^11 bits of information on a single frequency. And this number can be increased by cramming many signals into a single optical fiber and then bundling these fibers into a cable. This means that, by increasing the number of channels in a cable and then increasing the number of cables, one can transmit information almost without limit.
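
As a rough sketch of that estimate: the figure of a thousand carrier cycles per bit, and the channel and fiber counts, are illustrative assumptions chosen only to match the order of magnitude in the text:

    # Order-of-magnitude estimate of fiber-optic capacity.
    # The cycles-per-bit, channel, and fiber counts are illustrative assumptions.
    CARRIER_FREQUENCY_HZ = 1e14   # an optical carrier vibrates ~10^14 times per second
    CYCLES_PER_BIT = 1e3          # "many cycles to convey one bit"; assume about a thousand
    CHANNELS_PER_FIBER = 100      # separate signals crammed into one fiber (assumed)
    FIBERS_PER_CABLE = 100        # fibers bundled into one cable (assumed)

    bits_per_second_per_channel = CARRIER_FREQUENCY_HZ / CYCLES_PER_BIT   # ~10^11 bits/s
    cable_capacity = bits_per_second_per_channel * CHANNELS_PER_FIBER * FIBERS_PER_CABLE

    print(f"One channel: about {bits_per_second_per_channel:.0e} bits per second")
    print(f"Whole cable: about {cable_capacity:.0e} bits per second")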

Third, and most important, the computer revolution is driven by miniaturizing transistors. A transistor is a gate, or switch, that controls the flow of electricity. If an electric circuit is compared to plumbing, then a transistor is like a valve controlling the flow of water. In the same way that the simple twist of a valve can control a huge volume of water, the transistor allows a tiny flow of electricity to control a much larger flow, thereby amplifying its power.

At the heart of this revolution is the computer chip, which can contain hundreds of millions of transistors on a sliver of silicon the size of your fingernail. Inside your laptop there is a chip whose transistors can be seen only under a microscope. These incredibly tiny transistors are created in much the same way that designs on T-shirts are made.

Designs on T-shirts are mass-produced by first creating a stencil with the outline of the pattern one wishes to create. Then the stencil is placed over the cloth, and spray paint is applied. Only where there are gaps in the stencil does the paint penetrate to the cloth. Once the stencil is removed, one has a perfect copy of the pattern on the T-shirt.

Likewise, a stencil is made containing the intricate outlines of millions of transistors. It is placed over a wafer built up from many layers of silicon and coated with a light-sensitive material. Ultraviolet light is then shone onto the stencil; the light passes through the gaps and exposes the light-sensitive coating on the wafer below.

Then the wafer is bathed in acid, carving out the outlines of the circuits and creating the intricate design of millions of transistors. Since the wafer consists of many conducting and semiconducting layers, the acid cuts into the wafer at different depths and in different patterns, so one can create circuits of enormous complexity.

One reason why Moore's law has relentlessly increased the power of chips is that UV light can be tuned so that its wavelength is smaller and smaller, making it possible to etch ever-tinier transistors onto silicon wafers. Since UV light has a wavelength as small as 10 nanometers (a nanometer is a billionth of a meter), and a single silicon atom is roughly a third of a nanometer across, the smallest transistor that you can etch this way is about thirty atoms across.
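
As a quick sanity check on that figure (a minimal sketch; the 0.3-nanometer spacing between silicon atoms is an approximation):

    # Rough conversion from etched feature size to a count of silicon atoms.
    # The ~0.3-nanometer atomic spacing is an approximation.
    SILICON_ATOM_SPACING_NM = 0.3

    def atoms_across(feature_size_nm):
        """Approximate number of silicon atoms spanning a feature of the given width."""
        return feature_size_nm / SILICON_ATOM_SPACING_NM

    print(f"10 nm feature: about {atoms_across(10):.0f} atoms across")   # roughly 30 atoms
    print(f"1.5 nm layer:  about {atoms_across(1.5):.0f} atoms thick")   # roughly 5 atoms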

But this process cannot go on forever. At some point, it will be physically impossible to etch transistors this way once they reach the size of atoms. You can even calculate roughly when Moore's law will finally collapse: when transistors hit the size of individual atoms.

Around 2020 or soon afterward, Moore's law will gradually cease to hold true, and Silicon Valley may slowly turn into a rust belt unless a replacement technology is found. Transistors will be so small that quantum theory and atomic physics take over and electrons leak out of the wires. For example, the thinnest layer inside your computer will be only about five atoms thick. At that scale the quantum theory takes over. The Heisenberg uncertainty principle states that you cannot know both the position and the velocity of any particle with perfect precision. This may sound counterintuitive, but at the atomic level you simply cannot pin down where the electron is, so it can never be confined precisely within an ultrathin wire or layer; it necessarily leaks out, shorting the circuit. According to the laws of physics, the Age of Silicon will eventually come to a close, as we enter the Post-Silicon Era.
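
To put a rough number on that leakage argument, here is a minimal sketch; the one-nanometer confinement width is an illustrative stand-in for a layer a few atoms thick:

    # Rough Heisenberg estimate: confine an electron to a layer about one nanometer
    # thick (a few atoms across) and compute its minimum velocity uncertainty.
    HBAR = 1.054571817e-34          # reduced Planck constant, in joule-seconds
    ELECTRON_MASS = 9.1093837e-31   # electron mass, in kilograms

    delta_x = 1e-9                       # confinement width: ~1 nanometer (illustrative)
    delta_p = HBAR / (2 * delta_x)       # uncertainty principle: delta_x * delta_p >= hbar / 2
    delta_v = delta_p / ELECTRON_MASS    # corresponding spread in velocity

    print(f"Minimum velocity uncertainty: about {delta_v:,.0f} meters per second")
    # Prints roughly 58,000 m/s; far too jittery to stay put inside a wire a few atoms thick.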

Excerpted from Physics of the Future by Michio Kaku Copyright (c) 2011 by Michio Kaku. Excerpted by permission of Doubleday, a division of Random House, Inc. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Michio Kaku is a professor of physics at the CUNY Graduate Center, a co-founder of string field theory and the author of several science books. His new book is "Physics of the Future."


By Michio Kaku
