
Neuromorphic Computing Could Improve Performance

Mark Dean

As long as there have been computers, they have been compared to the human brain. The brain has enormous computational power, with some estimates suggesting the equivalent of billions of calculations per second.

Modern high-performance computing systems – often called supercomputers – have caught up to the brain in terms of speed and storage capacity. However, the brain remains a more efficient machine, with very little energy cost to the body, while high-performance computing systems require a tremendous amount of energy to operate. The brain also maintains an edge in terms of flexibility and the ability to learn. Enter neuromorphic computing.

“Neuromorphic computing uses the model of the brain to build systems,” said Mark Dean, John Fisher Distinguished Professor in the Min H. Kao Department of Electrical Engineering and Computer Science. “It uses neurons and synapses, which are common in biological systems, to do computation and transfer information.”
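The basic idea behind spiking, neuron-based computation can be illustrated with a minimal leaky integrate-and-fire neuron. This is a generic textbook sketch, not a description of Dean's hardware; the parameter names and values (threshold, leak, weight) are assumptions chosen for illustration.

```python
def simulate_lif(inputs, threshold=1.0, leak=0.9, weight=0.5):
    """Minimal leaky integrate-and-fire (LIF) neuron sketch.

    Each time step, weighted input spikes are accumulated into a
    membrane potential, which otherwise leaks toward zero. When the
    potential crosses the threshold, the neuron fires (emits 1) and
    resets. Parameters here are illustrative, not from any real chip.
    """
    potential = 0.0
    output_spikes = []
    for spike_in in inputs:
        potential = potential * leak + weight * spike_in
        if potential >= threshold:
            output_spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            output_spikes.append(0)
    return output_spikes

# A steady train of input spikes makes the neuron fire periodically.
print(simulate_lif([1, 1, 1, 1, 1, 1]))  # [0, 0, 1, 0, 0, 1]
```

Unlike a conventional processor, such a neuron only does meaningful work when spikes arrive, which is one reason neuromorphic designs can be far more power-efficient.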

According to Dean, neuromorphic computing has the potential to significantly reduce power consumption, take on more complex functions, and increase the complexity of problems a computer can manage.

“Our goal is not to replicate the brain but to learn from what we know and build a computer that is much more efficient, has much more scalability, and can solve problems that are difficult for computers to solve – like watching a video and identifying a person in that video. Humans can do that, but computers have a hard time with those kinds of applications,” said Dean.

In its first year, Dean’s JDRD team worked to develop a scalable neural network structure using a neuromorphic array communications controller and a second-generation dynamic adaptive neural network array applied to an autonomous robot. Graduate student Aaron Young has demonstrated the scalability and flexibility of this structure and plans to submit his findings for publication in the next year.

At the end of this funding year, Dean hopes to have created a neural network with the ability to scale upward from small applications to large systems like high-performance computers.