
Nanomagnets promise less power-hungry AI


Technology News
By Rich Pell



While AI is catching up to the human brain at many tasks, it usually consumes far more energy, often thousands of times as much, to do the same things. The hardware used to run neural networks on conventional silicon chips also lags, making AI slower, less efficient and less effective than the brain.

A less energy-intensive approach, say the researchers, would be to build AI's neural networks from other kinds of hardware. One promising device is the magnetic tunnel junction (MTJ), which excels at the kinds of math a neural network uses and needs only comparatively small "sips" of energy.

Other novel devices based on MTJs have been shown to use a fraction of the energy of their traditional hardware counterparts. MTJs can also operate more quickly because they store data in the same place they compute, unlike conventional chips, which store data elsewhere. Perhaps best of all, say the researchers, MTJs are already commercially important: they have served as the read-write heads of hard disk drives for years and are used in novel computer memories today.
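The math in question is the multiply-and-accumulate step at the heart of every neural network layer: in an MTJ array, each junction's resistance state encodes a weight, and summing the currents along a column computes a weighted sum in place, without shuttling data to a separate processor. Here is a minimal NumPy sketch of that idea; the conductance values and binary weights are illustrative assumptions, not measurements from the NIST device:

    import numpy as np

    # Each junction stores a binary weight as a low- or high-conductance
    # state. Applying input voltages to the rows and summing the currents
    # on each column (Ohm's and Kirchhoff's laws) yields one weighted sum
    # per column, computed where the data is stored.
    G_LOW, G_HIGH = 0.1, 1.0                           # illustrative conductances
    weights = np.random.randint(0, 2, size=(13, 3))    # 13 inputs x 3 outputs, binary
    conductance = np.where(weights == 1, G_HIGH, G_LOW)

    x = np.random.rand(13)             # input features encoded as row voltages
    column_currents = x @ conductance  # the analog multiply-accumulate, in one step
    print(column_currents)             # each column current is one weighted sum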

Though the researchers say they are confident in the energy efficiency of MTJs, based on their past performance in hard drives and other devices, energy consumption was not the focus of this study. First, they needed to know whether an array of MTJs could work as a neural network at all. To find out, they took it for a virtual wine tasting.

Scientists with NIST's Hardware for AI program and colleagues from the University of Maryland fabricated and programmed a very simple neural network from MTJs provided by their collaborators at Western Digital's Research Center in San Jose, California. The researchers first trained the network on 148 wines from a dataset of 178, made from three types of grapes.

Each virtual wine had 13 characteristics to consider, such as alcohol level, color, flavonoids, ash, alkalinity and magnesium. Each characteristic was assigned a value between 0 and 1 for the network to consider when distinguishing one wine from the others.
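Readers who want a purely software analogue can reproduce the setup's shape with scikit-learn, whose built-in wine dataset has the same 178 samples, 13 features and three grape cultivars described here. The split, the 0-to-1 scaling and the single-layer Perceptron below are stand-ins, not the authors' hardware or training procedure:

    # A minimal software sketch, assuming scikit-learn's wine dataset as a
    # proxy for the one described in the article.
    from sklearn.datasets import load_wine
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import MinMaxScaler
    from sklearn.linear_model import Perceptron

    X, y = load_wine(return_X_y=True)    # 178 wines, 13 features, 3 classes
    X = MinMaxScaler().fit_transform(X)  # scale each characteristic to [0, 1]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=148, stratify=y, random_state=0)  # 148 train / 30 held out

    clf = Perceptron().fit(X_train, y_train)  # a simple single-layer network
    print(f"held-out accuracy: {clf.score(X_test, y_test):.1%}")
    print(f"full-dataset accuracy: {clf.score(X, y):.1%}")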

“It’s a virtual wine tasting,” says NIST physicist Brian Hoskins, “but the tasting is done by analytical equipment that is more efficient but less fun than tasting it yourself.”

The system was then given a virtual wine-tasting test on the full dataset, which included 30 wines it hadn't seen before. The system passed with a 95.3% success rate, and out of the 30 wines it hadn't trained on, it made only two mistakes, a good sign, say the researchers.

“Getting 95.3% tells us that this is working,” says NIST physicist Jabez McClelland.

The point, say the researchers, is not to build an AI sommelier. Rather, this early success shows that an array of MTJ devices could potentially be scaled up and used to build new AI systems.

While the amount of energy an AI system uses depends on its components, using MTJs as synapses could cut its energy use by half or more, say the researchers. That could enable lower-power applications such as "smart" clothing, miniature drones, and sensors that process data at the source.

“It’s likely that significant energy savings over conventional software-based approaches will be realized by implementing large neural networks using this type of array,” says McClelland.

For more, see “Implementation of a Binary Neural Network on a Passive Array of Magnetic Tunnel Junctions.”

