A Microchip That Mimics The Human Brain

A new brain-like microchip has 1 million electronic neurons and more than 256 million artificial synapses connecting them. IBM researchers and their colleagues, with funding from DARPA and technology from Samsung, designed the breakthrough device, described today in the journal Science.

“This chip represents a completely new architecture for microchips,” says study co-author Dharmendra Modha, a computer scientist at IBM Research-Almaden in San Jose, Calif. “It’s a new machine for a new era.” The scientists say the chip has incredible potential: It could be used in glasses for the visually impaired, medical imagers that can spot early signs of disease, and even driverless cars.

Copying Nature

Brains are the most powerful computers we know: The human brain has approximately 100 billion neurons, with roughly one quadrillion (one million billion) connections between them. The way in which these connections or synapses wire neurons together encodes our memories.

Scientists have long sought to mimic how the brain works using software programs known as neural networks and hardware known as neuromorphic chips. The basic building block of the new chip is what the researchers call a neurosynaptic core, each consisting of 256 output lines and 256 input lines. The output lines mimic neurons, the cells that make up the brain, while the input lines mimic axons, the branches that neurons extend to communicate with other cells.

The researchers connected 4,096 such cores into a system 4.3 square centimeters in size. Named TrueNorth, the chip has 5.4 billion transistors and roughly 428 million bits of on-chip memory. Potentially, the scientists say, those chips could be combined like tiles into sheets of virtually any size.
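The headline figures follow directly from the tiled design: the per-core and core-count numbers above multiply out to the chip's totals. A quick back-of-the-envelope check, using only figures quoted in the article:

```python
# Totals for the tiled core design: 4,096 cores, each with 256 neurons
# and 256 x 256 = 65,536 synapses (figures as reported in the article).
CORES = 4096
NEURONS_PER_CORE = 256
SYNAPSES_PER_CORE = 256 * 256

total_neurons = CORES * NEURONS_PER_CORE    # 1,048,576 -> "1 million neurons"
total_synapses = CORES * SYNAPSES_PER_CORE  # 268,435,456 -> "more than 256 million"

print(f"{total_neurons:,} neurons, {total_synapses:,} synapses")
```

The synapse total, 268,435,456, is where the "more than 256 million" claim comes from.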


The layout of the chip (left) shows that its architecture comprises a 64x64 array of “neurosynaptic cores.” Each core (right) implements 256 neurons and 65,536 synapses and tightly integrates computation, memory, and communication. (Photo Credit: IBM Research)

Saving Energy

In a conventional microchip, which uses what’s called von Neumann architecture, the central processing unit is separated from the memory by a connection known as the bus. The problem with this design is that the bus serves as a bottleneck, and the constant relaying of data between the processor and memory consumes energy. In earlier supercomputer simulations, in which Modha and his colleagues simulated 530 billion neurons and more than 100 trillion synapses, the system consumed 12 gigawatts. “[That’s] more than New York, Los Angeles, and a regular city in the Midwest combined,” Modha says.

In TrueNorth, though, each neurosynaptic core holds both a processor and memory, which saves energy. And unlike conventional microchips, which use clock signals fired at regular intervals to coordinate the actions of circuits, activity in the neurosynaptic cores is driven only when an electrical charge reaches a specific value, much like what happens in real brains.
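The event-driven behavior described above can be illustrated with a minimal integrate-and-fire sketch: a neuron accumulates incoming charge and only "does work" (fires) when the charge crosses a threshold, rather than updating on every clock tick. This is an illustrative model, not IBM's actual circuit; the threshold and leak values are arbitrary.

```python
# Minimal event-driven spiking sketch: the neuron integrates incoming
# charge and fires only when its potential crosses a threshold, much as
# described for TrueNorth's cores. Parameter values are illustrative.

def run_neuron(inputs, threshold=1.0, leak=0.05):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, charge in enumerate(inputs):
        potential = max(0.0, potential - leak) + charge  # leak, then integrate
        if potential >= threshold:                       # event: fire and reset
            spikes.append(t)
            potential = 0.0
    return spikes

print(run_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # -> [2, 5]
```

Notice that nothing happens at quiet time steps; in hardware, that silence is exactly where the energy savings come from.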

“The power density of this chip is just 20 milliwatts per square centimeter,” Modha says. “That’s four orders of magnitude cooler than today’s microprocessors, which have a power density of 100 watts per square centimeter.”

Whereas computation in modern supercomputers is typically measured by floating-point operations per second (FLOPS), in TrueNorth computation is measured using synaptic operations per second (SOPS). TrueNorth can deliver 46 billion SOPS per watt, whereas today’s most energy-efficient supercomputer achieves 4.5 billion FLOPS per watt, the researchers said.
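Taking the quoted figures at face value gives a rough sense of the efficiency gap, with the caveat that SOPS and FLOPS measure different kinds of work and are not directly comparable; this is only the ratio of the two per-watt numbers the researchers cite.

```python
# Ratio of the per-watt figures quoted in the article. SOPS and FLOPS
# count different operations, so this is a rough comparison only.
truenorth_sops_per_watt = 46e9       # 46 billion SOPS per watt
supercomputer_flops_per_watt = 4.5e9  # 4.5 billion FLOPS per watt

ratio = truenorth_sops_per_watt / supercomputer_flops_per_watt
print(f"~{ratio:.1f}x more operations per watt")  # ~10.2x
```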

“It’s exciting to see large-scale neuromorphic computing arrive in the scientific community at large,” said neuroscientist and computer scientist Michael Schmuker at the Free University of Berlin, who did not take part in this study.


Milk and Cookies

Neural networks and neuromorphic chips are often used for pattern-recognition tasks where conventional computers do poorly, such as identifying pictures. In experiments, the microchip identified people, bicyclists, cars, trucks, and buses in 400-pixel-by-240-pixel video input at 30 frames per second.

Modha noted that neuromorphic chips would not replace conventional microchips, but rather work side by side with them. If you want to do analytics and number-crunching, he says, you should use conventional chips. “But if you want pattern recognition, to recognize a friend’s face in a crowd, use synaptic devices,” Modha says. “The two kinds of computing are meant to go together, like yin and yang or milk and cookies.”

In the future, the scientists envision augmenting their neurosynaptic cores with synaptic plasticity, the ability for synapses to vary in strength over time. This would allow experiences to change the chips, enabling learning. The team also wants other researchers to learn to program these chips and experiment with them.

“If neuromorphic computing is really to rival conventional von Neumann computing, we need a huge community of developers who create neuromorphic solutions to practical computing problems,” Schmuker says. “This will only happen if the technology is available to the entire scientific community.”

Article by Charles Q. Choi
