A few weeks ago, in June, I wandered over to IBM’s (s ibm) research facility in Almaden, Calif., to see what Big Blue was doing in the fields of materials research and semiconductor manufacturing. While there, I sat down with Dharmendra Modha, manager of cognitive computing at IBM Research Almaden, to discuss his project, which aims to simulate the way brains work in hopes of advancing how our computers process information in real time by changing the basic architecture of the chip. Or as Modha says, to offer up “a sense of how the brain gives rise to the mind.”
The research done today may never yield tangible changes in semiconductor architecture, and even if it does, those changes are decades into the future. But the problem of meeting the ever-increasing demand for computing without creating a similarly overwhelming demand for electrical power is at the heart of what Modha is trying to do. The work of visualizing how brains think could one day show IBM how to build better computers.
To eventually build those computers, IBM is building out a new lab for Modha, which will contain 16 monitors capable of representing 2.64 million neurons, with each pixel representing a neuron. Researchers will then use those neural maps to see how the brain reacts to stimuli. It’s no small task. For example, a cat brain, which Modha has simulated, contains 700 million neurons with trillions of connections between them. Writing algorithms that can display all of that is a daunting task.
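To get a rough sense of the scale involved, here is a back-of-envelope sketch of the pixel-per-neuron scheme. The monitor resolution (1920×1080) is my assumption, not a figure from the article, as are the resulting totals:

```python
# Back-of-envelope: how many neurons can 16 monitors display at
# one pixel per neuron? The per-monitor resolution below is an
# assumption; the article does not specify it.
MONITORS = 16
WIDTH, HEIGHT = 1920, 1080  # assumed 1080p panels

pixels_per_monitor = WIDTH * HEIGHT            # 2,073,600 pixels
total_pixels = MONITORS * pixels_per_monitor   # 33,177,600 pixels

# A cat-scale brain of roughly 700 million neurons would not fit
# on one wall of screens at one pixel per neuron; it would take
# many screenfuls (or aggregation of neurons into pixels).
cat_neurons = 700_000_000
screenfuls_needed = cat_neurons / total_pixels

print(f"Pixels across {MONITORS} monitors: {total_pixels:,}")
print(f"Screenfuls for a cat-scale map: {screenfuls_needed:.1f}")
```

The gap between tens of millions of pixels and hundreds of millions of neurons is one way to see why writing visualization algorithms at this scale is, as the article puts it, a daunting task.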
But tomorrow Modha will publish a paper detailing his team’s latest achievement – mapping a monkey’s brain, which is far more complicated and gets the lab closer to mapping a human mind. The goal of such visualizations is to help advance computing by changing the way computers solve problems. It’s not so much a means to build artificial intelligence as it is a way of discovering how to architect new types of chips that can keep up with a barrage of real-time information.
The video (see below) of our conversation gets fairly deep, but as Modha explains, the effort is an attempt to combine supercomputing, nanotechnology and neuroscience. He’s trying to apply advances in understanding the anatomy of the human brain, filtered through a supercomputer, with the end goal of creating some type of computer built using new technologies that would allow the future machine to be smarter and more power efficient. IBM isn’t the only entity interested in such work – the U.S. government has given IBM and Modha DARPA grants worth more than $20 million – and companies from Intel to HP are also pondering ways to push computing to the limits (GigaOM Pro sub req’d).