Stashing algorithm cuts AI energy consumption

Technology News | By Nick Flaherty

Researchers in Korea have proposed a new system to reduce the energy consumption in spiking neuromorphic AI networks that can be used with today’s chip designs.

The research group, led by Professor Kyung Min Kim of the Department of Materials Science and Engineering at KAIST in Daejeon, has developed a technique that handles AI operations efficiently by imitating how the brain continuously changes its neural network topology to suit the situation.

Neural networking techniques that mimic brain activity, such as neuromorphic or spiking neural network designs, are seen as a way to reduce the high power consumption of AI accelerators in sensing and vision applications.

The human brain changes its neural topology in real time, learning to store or recall memories as needed. The research group developed the new learning method by directly implementing these neural coordination circuit configurations.

The team used a self-rectifying synaptic memristor array together with an algorithm called a ‘stashing system’ developed for artificial intelligence learning. The stashing system reduced energy consumption by 37 percent with no degradation in accuracy.
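The article does not detail the stashing algorithm itself, but the core idea it describes is reconfiguring the network topology on the fly, parking synapses that are not currently needed so they cost no energy, and restoring them later. The sketch below is a hypothetical illustration of that concept only; the class, method names, and thresholding rule are all invented here and are not KAIST's actual algorithm.

```python
# Hypothetical sketch of the "stashing" concept: temporarily park
# low-magnitude synaptic weights so they are skipped during computation,
# then restore them when the network topology needs to change again.
# This is an illustration of the idea described in the article,
# not the researchers' actual algorithm.

class StashingLayer:
    def __init__(self, weights):
        self.weights = dict(weights)   # synapse id -> weight (active)
        self.stash = {}                # parked (inactive) synapses

    def stash_below(self, threshold):
        """Park synapses whose weight magnitude falls below the threshold."""
        for sid in [s for s, w in self.weights.items() if abs(w) < threshold]:
            self.stash[sid] = self.weights.pop(sid)

    def restore_all(self):
        """Bring all stashed synapses back into the active topology."""
        self.weights.update(self.stash)
        self.stash.clear()

    def forward(self, inputs):
        """Weighted sum over active synapses only; stashed ones cost nothing."""
        return sum(inputs[sid] * w for sid, w in self.weights.items())

# Weights chosen to be exact in binary floating point:
layer = StashingLayer({"s0": 0.75, "s1": 0.0625, "s2": -0.5})
layer.stash_below(0.1)                               # "s1" is parked
out = layer.forward({"s0": 1.0, "s1": 1.0, "s2": 1.0})
print(out)  # -> 0.25, computed over 2 of 3 synapses
```

Here the energy saving would come from the skipped synapses; in the hardware described, the equivalent operations happen in the memristor array rather than in software.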

The array uses a memristive dot product engine (MDPE), a crossbar structure in which each memory element includes a memristor; the same structure has also been used for vector processing.
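An idealised MDPE computes a vector-matrix product in a single analog step: each crosspoint stores a conductance, voltages applied to the rows produce currents at the columns, and by Kirchhoff's current law each column current is the dot product of the input voltages with that column of conductances. The following minimal model illustrates this; the function name and the conductance values are illustrative, not from the article.

```python
# Illustrative model of an ideal memristive dot-product engine (MDPE).
# Each crosspoint stores a conductance G[i][j]; applying voltages V to
# the rows yields column currents I_j = sum_i V_i * G[i][j], i.e. a
# vector-matrix product computed in one analog step.

def mdpe_dot_product(conductances, voltages):
    """Column currents of an ideal crossbar (no wire resistance or noise)."""
    rows = len(conductances)
    cols = len(conductances[0])
    assert len(voltages) == rows
    return [sum(voltages[i] * conductances[i][j] for i in range(rows))
            for j in range(cols)]

# A 2x2 crossbar with conductances exact in binary floating point:
G = [[0.25, 0.5],
     [0.5,  1.0]]
V = [2.0, 4.0]                    # input voltages on the rows
print(mdpe_dot_product(G, V))    # -> [2.5, 5.0]
```

A real array departs from this ideal (wire resistance, device variation, sneak-path currents, which is one motivation for self-rectifying memristors), but the computational principle is the same.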

“In this study, we implemented the learning method of the human brain with only a simple circuit composition and through this we were able to reduce the energy needed by nearly 40 percent,” said Kim.

The stashing system is compatible with existing electronic devices and commercialized semiconductor hardware. It is expected to be used in the design of next-generation semiconductor chips for artificial intelligence.
