IBM ... interesting
posted on
Nov 21, 2014 10:11PM
A few months back ... very interesting ... especially that last line ...
IBM is putting $3 billion toward new chip research. (Photo: Christina Welsh/Flickr)
Since the computer age began, microchips have steadily gotten smaller. Moore’s Law, articulated in 1965 by Intel co-founder Gordon Moore, predicts, fairly accurately to date, that the number of transistors we can fit on a microchip will double every 18 to 24 months, constantly increasing computer speed and efficiency. Many computer scientists and engineers, however, believe we will soon reach a point where the traditional chip circuitry made of silicon will be too microscopic to work reliably.
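To see what that doubling rule implies, here's a quick back-of-the-envelope sketch in Python (the one-billion-transistor starting point is a hypothetical round number, not a real product figure):

```python
# Back-of-the-envelope Moore's Law projection: transistor count
# doubling every 24 months. The starting count is illustrative.
def projected_transistors(start_count, start_year, year, doubling_years=2.0):
    """Project a transistor count forward, assuming a fixed doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# Hypothetical chip with 1 billion transistors in 2010:
for year in (2010, 2012, 2014, 2016):
    print(year, f"{projected_transistors(1e9, 2010, year):,.0f}")
```

At a two-year doubling period, the transistor count quadruples every four years -- the pace the industry is now struggling to maintain.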
So what’s going to happen? No one is sure yet, but chipmakers are already making moves to safeguard the future of hardware development. This week, IBM announced plans to allocate $3 billion over five years to chip research. While the company's overall R&D expenditures will remain the same, there is a new focus not only on miniaturizing circuitry to 7 nanometers, but also on replacing silicon chips with alternative technologies.
Georgia Tech computer scientist Tom Conte tells _Popular Science_ that 7-nanometer transistors are “basically the size of large atoms. There are a lot of unknown quantum effects” that can’t be controlled, so chipmakers can’t guarantee reliable function.
Intel can currently make transistors at 22 nanometers wide, and plans to offer 14 nanometers next year. Moore’s Law has generally held true -- we've been increasing the number of transistors on chips for decades. But according to Conte, "there's been no big benefit for a while now." From 1994 to 1998, maximum CPU clock speeds rose by 300 percent. Between 2007 and 2011, those speeds increased by a mere 33 percent.
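Those totals are easier to compare as annualized rates. A quick sketch, using the percentages quoted above rather than measured clock data:

```python
# Convert the total clock-speed gains quoted above into compound
# annual growth rates, to make the slowdown easier to see.
def annualized(total_gain, years):
    """Turn a total fractional gain over `years` into a per-year rate."""
    return (1 + total_gain) ** (1 / years) - 1

print(f"1994-1998: {annualized(3.00, 4):.1%} per year")  # ~41.4% per year
print(f"2007-2011: {annualized(0.33, 4):.1%} per year")  # ~7.4% per year
```

Roughly a 41 percent yearly gain in the mid-1990s versus about 7 percent in the late 2000s.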
Conte predicts “silicon’s days are numbered. We’ve hit a place where we need to step back and rethink how we design computers.” IBM seems to agree: its recent announcement cited several burgeoning technologies that could lead to breakthroughs in chip development, making chips not only smaller, but also more efficient and more reliable.
One is quantum computing, where the goal is to expand a computer's raw processing power. Traditional bits of information hold a value of only 0 or 1, but quantum bits (qubits) can hold 0, 1, or both at once, enabling a system to work through millions of calculations at the same time.
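Here's a minimal numerical sketch of what that superposition means -- an idealized qubit modeled in NumPy, purely for illustration, not IBM's hardware:

```python
import numpy as np

# A qubit's state is a pair of complex amplitudes over |0> and |1>.
zero = np.array([1, 0], dtype=complex)           # the |0> state

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(superposed) ** 2)                   # [0.5 0.5]

# n qubits span 2**n amplitudes at once.
print(f"20 qubits -> {2 ** 20} simultaneous amplitudes")
```

A 20-qubit register already spans over a million amplitudes, which is where the "millions of calculations at the same time" comes from.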
Another option is to pursue neurosynaptic computing, which uses circuitry that is “based on the structures we see in the brain,” says Conte. The idea is to make computers emulate certain processes that neurological systems excel at, like pattern detection.
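What "brain-inspired circuitry" means is easiest to see in a toy model. Below is a sketch of a leaky integrate-and-fire neuron, the kind of unit such chips implement in silicon -- a deliberate simplification, not IBM's actual design:

```python
# Toy leaky integrate-and-fire neuron: it accumulates input, slowly
# leaks charge, and emits a spike when it crosses a threshold.
def run_neuron(inputs, leak=0.9, threshold=1.0):
    potential, spikes = 0.0, []
    for x in inputs:
        potential = potential * leak + x   # leak old charge, add new input
        if potential >= threshold:         # threshold crossed: fire and reset
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

print(run_neuron([0.3, 0.4, 0.5, 0.1, 0.9]))   # -> [0, 0, 1, 0, 0]
```

Networks of units like this spike only when inputs line up in the right way, which is why the approach suits tasks like pattern detection.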
Nanophotonics (also known as silicon photonics) processes information using pulses of light rather than electrical signals. In its announcement, IBM expressed hopes that nanophotonics could provide “a super highway for large volumes of data to move at rapid speeds between computer chips in servers.”
The current structure of microchips might also remain the same, save for the silicon. Carbon nanotubes – single-atom-thick sheets of carbon rolled into tubes – reportedly perform up to 10 times faster than silicon, and could act as a straightforward replacement for the transistor material.
None of these technologies, however, has been tested nearly enough. Furthermore, some experts remain ardently skeptical that silicon is even on its way out. “I wouldn’t bet a dollar on any of this stuff,” says MIT computer scientist Srini Devadas. “The quantum stuff is just so far out,” he says, and he doesn’t believe carbon nanotubes or nanophotonics could feasibly compete with silicon in the near future. Transistor miniaturization will probably still slow down considerably once we reach 7 nanometers, but Devadas believes there's still a lot of room for innovation using existing materials. “Why not just develop a variant of silicon that works?” he asks.
Devadas also points out that the $3 billion IBM has pledged is “small peanuts” compared to the hundreds of billions chipmakers like IBM and Intel are already putting into research for silicon innovation. He believes that as silicon transistors continue to shrink, people are anxious to see other technologies usher in a “post-silicon” era, making IBM’s announcement seem more significant than it actually is.
Regardless of how promising other technologies turn out to be, it's pretty clear that silicon is here to stay for at least the next few years. “It’s the incumbent,” says Devadas. “Nothing else can compete.”