

Will Moore's Law Ever Stop Working?

June/July 2005
By: Dean Takahashi
Volume 3, Number 3

In 1965, Gordon Moore predicted that the number of transistors on a chip would double every year. In 1975, he amended that prediction to doubling every two years. And 40 years later, the prediction is still holding up.

No one is more surprised than Moore, a former Intel CEO, at how his seemingly mundane prediction, buried in the 35th anniversary issue of Electronics magazine, would come to define the progress of technology in the modern world.
"I wanted to get across the idea that integrated circuits were a way to make electronics cheap," Moore said in a recent press conference commemorating the 40th anniversary of the prediction. "I made this extrapolation. It turned out to be much more accurate than it had any reason to be."

He jokes that Moore's Law is a violation of Murphy's Law. When you shrink a transistor, the basic building block of an electronic circuit, everything gets better. The chip that uses the transistor becomes faster, cheaper, more reliable and smaller.

Moore's Law is the backbone behind the $213 billion chip industry and the $1 trillion electronics industry that it supports. It is why Intel has been able to move from putting 2,300 transistors on its first microprocessor in 1971 to 1.7 billion transistors on an upcoming Itanium microprocessor this year. That kind of staggering advance has put the power of an aging supercomputer into every cell phone, or even a child's toy. The cost of a transistor has fallen from $5.52 in 1954 to 191 billionths of a dollar in 2004, according to market analyst Dan Hutcheson, president of VLSI Research.
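Those two transistor counts are enough to check the doubling rate Moore settled on in 1975. The short Python sketch below (the variable names are illustrative, not from the article) computes the doubling period implied by going from 2,300 transistors in 1971 to 1.7 billion in 2005.

```python
import math

# Figures quoted in the article
transistors_1971 = 2_300           # Intel's first microprocessor, 1971
transistors_2005 = 1_700_000_000   # upcoming Itanium, 2005
years = 2005 - 1971

# How many doublings does that growth represent, and how long did each take?
doublings = math.log2(transistors_2005 / transistors_1971)
doubling_period = years / doublings

print(f"doublings: {doublings:.1f}")                  # about 19.5
print(f"years per doubling: {doubling_period:.2f}")   # about 1.7
```

The implied doubling period of roughly 1.7 years falls between Moore's original one-year estimate and his revised two-year figure, which is why the prediction is described as still holding up.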

"Moore's Law was the bellwether," said Bob Colwell, an independent consultant in Portland, Ore., and a former Intel chip architect. "It told you where you had to be if you were designing a chip that was going to come out a few years from now."

It follows that nothing could be more important than keeping Moore's Law going. Among the biggest efforts to make sure that chip manufacturers can stay on the path is a consortium of national research labs and top companies formed to create a critical tool that will enable the continued miniaturization of transistors. This $250 million research effort, which is just one of many, illustrates the scale of the challenge involved.

"I would say this consortium focused on the biggest obstacle to the continuation of Moore's Law," said John Goldsmith, the program manager at Lawrence Livermore.

The consortium around extreme ultraviolet lithography (EUV) began in 1997. Its purpose was to find a new way of writing patterns on chips when the old methods ran out of steam. Researchers at Sandia National Laboratories and Lawrence Livermore National Laboratory had been looking into the technology for more than a decade, so the labs teamed up with Intel, IBM, Advanced Micro Devices, Motorola, Micron Technology, and Infineon to commercialize EUV tools.

The companies contributed money while the federal researchers collaborated to develop a prototype tool. Goldsmith says they created a tool that will be able to print features on chips a hundred times smaller than those that can be printed by conventional lithographic tools.

"It essentially allows you to paint a finer line when you draw circuits," Goldsmith says. "You use a finer brush."

The researchers have completed the work, and the chip makers are now in the process of handing over the technology to chip manufacturing equipment makers, who will bring the tools to market in the 2009 time frame. Paolo Gargini, a senior fellow at Intel, says that his company will rely upon EUV tools at that time, but other chip makers who can't afford the hefty upfront costs will likely delay a few years.

Gargini notes that EUV research, as big as it is in terms of expense, is just one piece in the work to keep Moore's Law going. The law has given the industry great riches, but it has also burdened it with a requirement to stay on a relentless treadmill.

It is getting harder and harder to cram more transistors onto a single chip and to use those transistors productively so that computing performance keeps advancing. Besides printing smaller transistor designs, another looming problem is how to deal with the rising power consumption of chips. Cramming more transistors on a chip will soon concentrate too much heat in a given space, causing chips to melt down unless a way to reduce the heat is found.

Within 15 years, the chip industry now believes, it will have to move beyond silicon chips in order to stay on the pace of Moore's Law. The growing obstacles to chip advances have put enormous pressure on the industry to finance research into alternatives.

Why the fuss? If Moore's Law comes to a stop, the consequences could be felt throughout the world economy. Consumers buy technological marvels every few years because they're much better than the old gadgets they replace. That improvement in gadgetry is possible because of Moore's Law. Dale Jorgenson, an economist at Harvard University, figures that the information technology industry driven forward by Moore's Law accounts for a quarter of economic growth, a disproportionate share given the IT industry's size.
So much is at stake that the industry is worried about whether current research will pan out in time to fill the gap when today's technologies run out of steam. What comes next is anybody's guess.

Moore himself says he can never foresee more than five years into the future. He is confident that the industry will continue to deliver continuous advances, but he doesn't know how it will get there. He does believe that silicon itself, the foundation for making transistors for the past four decades, will hit its physical limits at some point. And when it does, something else has to take its place.

Gargini says 15 years isn't a lot of time to solve the technology problems and to replace the silicon manufacturing infrastructure. But he says the chip industry is prepared to introduce one new breakthrough at a time on a continuous basis.
In the past couple of years, chip makers such as Intel have begun producing chips with transistors that have 90-nanometer widths. (A nanometer is a billionth of a meter.) These chips use a breakthrough dubbed strained silicon, which stretches the silicon crystal so that transistors can conduct electrons faster. Gargini estimates that this technique will be useful until 2009.

In 2007, Intel believes, new insulating materials called "high-k dielectrics" will be introduced to reduce power dissipation in overheating chips. EUV could also begin testing in the same time frame and enter production in 2009. So for the next five years, the industry isn't worried.

From about 2010 to 2015, the future becomes fuzzy. At that time, the industry will likely create a hybrid of nanotechnology and silicon manufacturing.
Nanotechnology, which has been under research at federal labs such as Sandia for decades, can potentially introduce a whole new way of making things. For years, chip makers have taken bulk materials and honed them down into useful devices. With nanotechnology, researchers create designer molecules that can assemble themselves, or essentially grow, into the desired device.

Researchers want to fuse carbon nanotubes, which are graphene sheets rolled up into tubes that conduct electricity, with a traditional silicon chip. The tubes can serve as the electrical wiring to connect memory cells together. It isn't easy to manipulate the spaghetti-like tubes. But efforts are under way to make the materials assemble themselves into uniform grids atop a chip. Stan Williams, a senior fellow at Hewlett-Packard, believes that other nanotechnology efforts will yield results much sooner than the Intel researchers are predicting.

"The introduction of new technology should begin in about seven years time," Williams says.

Besides nanotechnology, another candidate that could replace silicon in the 2020 time frame is optical computing, which harnesses laser light to perform computations. Another is spintronics, in which the principles of very small magnets could be tapped to perform calculations at lower power levels than silicon uses.

Moore said that he is skeptical that anything will truly replace silicon manufacturing techniques. The problem, he says, is not just making one device in the labs but connecting a billion of them together.

Carver Mead, the former Caltech professor who coined the term "Moore's Law," said that if chip manufacturing technology runs out of gas, that will put more pressure on chip designers. He feels that chip architects have gotten a free ride out of Moore's Law, and he believes that architectural innovations, such as chips that process information the way the brain does, are key to better exploiting existing chip technology.

As difficult as the technological problems have become, history has shown that it's a good bet to be optimistic.

"Maybe it will slow down to doubling every three or four years when we get to a barrier," Moore says. "Or we could make bigger chips to make more complex circuits. There's a way out."
