
The following article is from Forbes Magazine. It had some insightful points and was sent to me by one of my graduate students, who commented: “POET is very interesting and could be a disruptor. They are definitely on the cutting edge with their improvements in gallium arsenide semiconductors and system-on-a-chip designs. Gallium arsenide solves a lot of heat issues but has always been substantially more expensive than silicon. The market is going to race to 5 nm over the next five years, at which point my understanding is you can’t go smaller or electrons become unstable. I wonder if there is one more innovation left in silicon that will continue to price POET out of the market. This could be a 3D-stacked processor or a neuromorphic chip design. My prediction would be that we see a few emerging technologies come to market in the near future and compete. We are probably either looking at the future with POET, or it is coming to market soon.”

Not my field… but I certainly thought it was worth the read, and it supports the notion of turbulence and disruption… For those with the time and interest… see what you think… Tpower


These 4 Major Paradigm Shifts Will Transform The Future Of Technology

See https://www.forbes.com/sites/gregsatell/2016/05/15/these-4-major-paradigm-shifts-will-transform-the-future-of-technology/#364be7ca30b0

For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.

Today, we’re at an inflection point, and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which most of what we’ve come to expect from technology will come undone. What replaces it will be truly new and different.

Over the next decade, Moore’s Law will end. Instead of replacing manual labor, technology will automate routine cognitive work. As information technology fades into the background, second-order technologies, such as genomics, nanotechnology and robotics, will take center stage.


Here are the four major paradigm shifts that we need to watch and prepare for.

From The Chip To The System

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which observed that the number of transistors on an integrated circuit was doubling every two years. He also predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems.

That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing performance has become exponentially cheaper and more robust, we have been able to do a lot more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations.
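
As a rough illustration of what “doubling every two years” compounds to, here is a small sketch in Python. The 1971 Intel 4004 baseline of roughly 2,300 transistors is a commonly cited figure, used here as an assumption; the doubling period follows the article.

# Rough illustration of Moore's Law as compound doubling.
# Assumptions: ~2,300 transistors on the 1971 Intel 4004 (a commonly
# cited figure) and the two-year doubling period named in the article.
def projected_transistors(year, base_year=1971, base_count=2300, period=2):
    """Project transistor counts under a doubling every `period` years."""
    return base_count * 2 ** ((year - base_year) / period)

for year in (1971, 1991, 2016):
    print(year, f"{projected_transistors(year):,.0f} transistors")
# 45 years of doubling turns a few thousand transistors into billions.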


Yet Moore’s Law is now nearing its end. The problem is twofold. First, there are only so many transistors you can squeeze onto a chip before quantum effects cause them to malfunction. Second is the problem known as the von Neumann bottleneck. Simply put, it doesn’t matter how fast chips can process data if they have to wait too long to communicate with each other.
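
A back-of-the-envelope sketch in Python makes the bottleneck concrete: once every operand has to cross the memory bus, data movement, not arithmetic, sets the pace. The throughput figures below are illustrative assumptions, not measurements of any real chip.

# Illustration of the von Neumann bottleneck with assumed, round numbers.
PEAK_OPS_PER_SEC = 1e12     # assume 10^12 simple operations per second
MEM_BYTES_PER_SEC = 100e9   # assume 100 GB/s memory bandwidth

n = 1_000_000_000                             # one billion 8-byte values
compute_time = n / PEAK_OPS_PER_SEC           # one operation per value
transfer_time = (n * 8) / MEM_BYTES_PER_SEC   # time spent just moving data

print(f"compute: {compute_time * 1e3:.0f} ms, memory: {transfer_time * 1e3:.0f} ms")
# compute: 1 ms, memory: 80 ms -- the processor idles ~99% of the time.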


So we have to shift our approach from the chip to the system. One approach, called 3D stacking, would combine integrated circuits into a single three-dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it could increase speeds significantly and allow progress to continue.


From Applications To Architectures

Since the 1960s, when Moore wrote his article, the ever-expanding power of computers has made new applications possible. For example, after the relational database was developed in 1970, it became possible to store and retrieve massive amounts of information quickly and easily. That, in turn, dramatically changed how organizations could be managed.
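
For a sense of what the relational model made routine, here is a minimal sketch using Python’s built-in sqlite3 module; the table and rows are invented for illustration.

# Minimal relational storage-and-retrieval sketch (invented example data).
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("Acme", 120.50), ("Globex", 75.00), ("Acme", 42.25)],
)

# Declarative query: describe the answer you want, not how to fetch it.
for customer, spent in conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"
):
    print(customer, spent)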

Later innovations, like graphic displays, word processors and spreadsheets, set the stage for personal computers to be widely deployed. The Internet led to email, e-commerce and, eventually, mobile computing. In essence, the modern world is little more than the applications that make it possible.

Until now, all of these applications have run on von Neumann machines: devices with a central processing unit paired with programs and data stored in a separate memory. So far, that has worked well enough, but for the things we’ve begun asking computers to do, like powering self-driving cars, the von Neumann bottleneck is proving to be a major constraint.


So the emphasis is moving from developing new applications to developing new architectures that can handle them better. Neuromorphic chips, based on the brain itself, will be thousands of times more efficient than conventional chips.


Quantum computers, which IBM has recently made available in the cloud, work far better for security applications. New FPGA chips, which can be reprogrammed in the field, can be optimized for still other applications.

Soon, when we choose to use a specific application, our devices will automatically be switched to the architecture—often, but not always, made available through the cloud—that can run it best.
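
One way to picture that kind of switching is a thin dispatch layer that routes each workload to whichever backend suits it best. Everything below, including the backend names and the routing table, is a hypothetical Python sketch, not any real product’s API.

# Hypothetical sketch: route each application to a suitable architecture.
from typing import Callable, Dict

def run_on_cpu(task: str) -> str:
    return f"{task}: run on a conventional CPU"

def run_on_neuromorphic(task: str) -> str:
    return f"{task}: run on a neuromorphic accelerator"

def run_on_quantum_cloud(task: str) -> str:
    return f"{task}: submitted to a cloud-hosted quantum backend"

# Invented routing table pairing workloads with backends.
ROUTES: Dict[str, Callable[[str], str]] = {
    "spreadsheet": run_on_cpu,
    "image-recognition": run_on_neuromorphic,
    "key-exchange": run_on_quantum_cloud,
}

def dispatch(task: str) -> str:
    """Send the task to the architecture that runs it best; default to CPU."""
    return ROUTES.get(task, run_on_cpu)(task)

for task in ("spreadsheet", "image-recognition", "key-exchange"):
    print(dispatch(task))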

From Products To Platforms

It used to be that firms looked to launch hit products. If you look at the great companies of the last century, they often rode to prominence on the back of a single great product, like IBM’s System/360, the Apple II or Sony’s Walkman. Those first successes could then lead to follow-ups, like the PC and the Macintosh, and to further dominance.

Yet look at successful companies today and they make their money off platforms. Amazon earns the bulk of its profits from third-party sellers, Amazon Prime and cloud computing, all of which are platforms. And what would Apple’s iPhone be without the App Store, where so much of its functionality comes from?


Platforms are important because they allow us to access ecosystems. Amazon’s platform connects ecosystems of retailers to ecosystems of consumers. The App Store connects ecosystems of developers to ecosystems of end users. IBM has learned to embrace open technology platforms, because they give it access to capabilities far beyond those of its own engineers.


The rise of platforms makes it imperative that managers learn to think differently about their businesses. While in the 20th century, firms could achieve competitive advantage by optimizing their value chains, the future belongs to those who can widen and deepen connections.


From Bits To Atoms

In The Rise and Fall of American Growth, economist Robert Gordon argues that the rapid productivity growth the US experienced from 1920 to 1970 is largely a thing of the past. While there may be short spurts of growth, like the one in the late 1990s, we’re not likely to see a sustained period of progress anytime soon.


Among the reasons he gives is that, while earlier innovations such as electricity and the internal combustion engine had broad implications, the impact of digital technology has been fairly narrow.


The evidence bears this out. We see, to paraphrase Robert Solow, digital technology just about everywhere except in the productivity statistics.


Still, there are indications that the future will look very different from the past. Digital technology is beginning to power new areas in the physical world, such as genomics, nanotechnology and robotics, that are already having a profound impact on such high-potential fields as renewable technology, medical research and logistics.

It is all too easy to get caught up in old paradigms. When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.

In an age of disruption, the only viable strategy is to adapt.

Greg Satell is a popular speaker and consultant. His first book, Mapping Innovation, is coming out in 2017. Follow his blog at Digital Tonto or on Twitter @DigitalTonto.
