It's a fascinating technology, one that will eventually be widely used. But it's important to understand the different paradigms in computing: digital, quantum, "brain"-inspired, etc.
For example, a quantum computer will excel at very large weather-modeling calculations but perform terribly when trying to play an HD video.
They all have their advantages and disadvantages because they take fundamentally different approaches to solving problems. Nothing is ever perfect.
"Such capability could enable new mobile device applications that emulate the human brain’s capability to swiftly process information about new events or other changes in real-world environments, whether that involves recognizing familiar sounds or a certain face in a moving crowd. IBM envisions its new chips working together with traditional computing devices as hybrid machines—providing an added dose of brain-like intelligence for smart car sensors, cloud computing applications or mobile devices such as smartphones. The chip's architecture was detailed in a new paper published in the 7 August online issue of the journal Science."
http://spectrum.ieee.org/tech-talk/computing/hardware/ibms-braininspired-computer-chip-comes-from-the-future