IBM demos cognitive computer chips
IBM’s effort is the crowning achievement of "phase zero" and "phase one" of a contract with the Defense Advanced Research Projects Agency (DARPA) to build Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE). IBM and its university partners—Columbia University, Cornell University, the University of California-Merced, and the University of Wisconsin-Madison—now enter "phase two," which extends their efforts for another 18 months with a new infusion of $21 million in funding. Including the new money, DARPA funding for the project amounts to $41 million in total.
The eventual goal is to create a brain-like cognitive computer with 10 billion neurons and 100 trillion synapses, with size and power consumption comparable to the human brain's.
"We want to extend and complement the traditional von Neumann computer for realtime uncertain environments," said Dharmendra Modha, project leader for IBM Research. "Cognitive computers must integrate the inputs from multiple sensors in a context dependent fashion in order to close the realtime sensory-motor feedback loop."
Though IBM claims its custom cognitive computing cores are the first of their kind, a rival European program using conventional ARM cores, called SpiNNaker—for spiking neural network architecture—was announced last month.
Traditional von Neumann computers are ill-equipped to deal with the multiple simultaneous data streams coming in from sensors today, but brains handle these easily by distributing processing and memory among their neural networks. In particular, sensors feed neurons down input lines called dendrites.
The neuron integrates over these inputs until a threshold is exceeded, at which point it fires a pulse down its output axon, which is weighted by the synapses connected to other neurons. Pattern recognition is accomplished by the synapses "learning" which connections are used most often, which causes them to grow stronger, while seldom used connections wither away. In this way, the neural network closes the sensory-motor feedback loop, since once a pattern is recognized from the sensor inputs, the output motor neurons mobilize a response.
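The integrate-and-fire behavior described above can be sketched in a few lines. This is an illustrative toy, not IBM's design: the function name, reset-to-zero behavior, and the sample weights and threshold are all assumptions made for the example.

```python
# Toy integrate-and-fire neuron: sum weighted dendrite inputs each time step,
# fire a spike and reset once the membrane potential crosses a threshold.
def run_neuron(inputs, weights, threshold):
    """Return a list of 0/1 spikes, one per input frame."""
    potential = 0.0
    spikes = []
    for frame in inputs:  # each frame holds one value per dendrite
        potential += sum(x * w for x, w in zip(frame, weights))
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

# Three dendrites; correlated input drives the potential over threshold.
print(run_neuron([[1, 0, 1], [1, 1, 1], [0, 0, 0]], [0.5, 0.5, 0.5], 1.0))
```

Lowering the threshold here makes the neuron fire sooner on weaker evidence, which mirrors the faster-but-cruder versus slower-but-more-refined trade-off IBM describes later in the article.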
IBM replicates the brain’s architecture by using a crossbar array to hold the synapses, which then learn which sensory patterns correspond to which desired motor control outputs. The crossbar array connects the neurons to sensor inputs by integrating over a large fan-in of dendrites, then firing output pulses down axons which feed individual synaptic connections to the other neurons in the network.
"Synapses are realized with a crossbar array, in which the vertical lines are the input dendrites and horizontal lines are the output axons," said Modha. "Each neuron fires in order to communicate with the other neurons which fully integrates memory with processor, instead of separating them like von Neumann."
Even though the final cognitive computers will have billions of neurons, they will only consume power when a neuron fires, which happens at the incredibly slow clock speed of 10 Hz. As a result, an entire brain-sized cognitive computer could fit into a shoebox and consume less than a thousand watts.
IBM showed two working prototype chips, both completely digital, which it hopes will serve as the cores of future cognitive computers where thousands will be integrated on multi-core chips.
"A key intellectual step forward was that our chips are all digital, allowing us to simulate on a supercomputer and then implant the results on a silicon chip, resulting in predictable, deterministic behavior," said Modha.
The two prototypes each use a few million transistors to implement a single core housing just 256 neurons and occupying less than four square millimeters, using IBM’s 45-nanometer silicon-on-insulator (SOI) complementary metal oxide semiconductor (CMOS) process. The only difference between the two test cores is in their use of the interconnecting crossbar array, either as 256k pre-programmable synapses or as 64k learning synapses. The chips were fabricated at IBM’s facility in Fishkill, N.Y., and are currently being tested at the T.J. Watson Research Center in Yorktown Heights, N.Y. and at IBM Research in San Jose, Calif.
In operation, IBM’s chips learn from experience, after several learning parameters are set. For instance, one parameter is the threshold level at which neurons fire after integrating over their multiple inputs, allowing faster but cruder operation when set low, or slower but more refined operation when set high. Then, as the neurons fire, the learning synapses adapt by changing their weights as they are used. IBM implements Hebb's rule (named for psychologist Donald Hebb), whereby the more a synaptic connection from one neuron to another is used, the more conductive it becomes by virtue of lowering its synaptic weight. Seldom-used pathways, on the other hand, accumulate higher weights that virtually prune them from the neural network.
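The Hebb-style update described above can be sketched as follows. Note the inverted convention the article describes: the stored weight behaves like a resistance, so frequently used synapses get lower weights (more conductive) while idle ones drift toward a pruning ceiling. The function name and the specific step sizes are assumptions for illustration only.

```python
# Hedged sketch of the use-dependent weight update described in the article:
# each step, decrement the weight of every synapse that was used (making it
# more conductive) and increment unused ones toward a ceiling that prunes them.
def hebbian_update(weights, used, decrement=0.1, increment=0.02, ceiling=1.0):
    """Return updated weights given which synapses were used this step."""
    new = []
    for w, u in zip(weights, used):
        if u:
            w = max(0.0, w - decrement)      # strengthen: lower weight, more conductive
        else:
            w = min(ceiling, w + increment)  # weaken: drift toward pruning
        new.append(w)
    return new

w = [0.5, 0.5]
for _ in range(3):
    w = hebbian_update(w, [True, False])
print([round(x, 2) for x in w])  # first synapse strengthens, second withers
```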
IBM envisions its cognitive computers tackling a wide variety of applications in navigation, machine vision, pattern recognition, associative memory and classification. So far it has taught one chip to recognize a cursive numeral "7" regardless of the handwriting. The other has learned to play (and win against humans at) the game "Pong."