Machine learning for sensors

August 12, 2019 // By Julien Happich
Researchers at the Fraunhofer Institute for Microelectronic Circuits and Systems IMS have developed an artificial intelligence (AI) concept for microcontrollers and sensors that contains a completely configurable artificial neural network.

Reducing data


AIfES demonstrator for handwriting recognition: numbers written by hand on the PS/2 touchpad are identified and output by the microcontroller.

AIfES does not focus on processing large amounts of data; instead, it transfers only the data needed to build very small neural networks. “We’re not following the trend toward processing big data; we’re sticking with the absolutely necessary data and are creating a kind of micro-intelligence in the embedded system that can resolve the task in question. We develop new feature extractions and new data pre-processing strategies for each problem so that we can realize the smallest possible ANN. This enables subsequent learning on the controller itself,” Gembaczka explains.

The approach has already been put into practice in several demonstrators. For example, the research team implemented the recognition of handwritten numbers on an inexpensive 8-bit microcontroller (Arduino Uno), made technically feasible by a newly developed feature extraction method. Another demonstrator recognizes complex gestures made in the air: the IMS scientists built a system consisting of a microcontroller and an absolute orientation sensor that recognizes numbers written in the air.
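To give a sense of how small such a network can be, the sketch below shows a plain-C forward pass over a hand-crafted feature vector rather than raw touchpad pixels. This is an illustrative assumption, not the AIfES API: the function name classify_digit, the feature count, the layer sizes and the choice of sigmoid activation are all hypothetical, chosen only to show why pre-extracted features keep the network small enough for an 8-bit microcontroller.

/*
 * Minimal sketch (NOT the actual AIfES implementation): a tiny fully
 * connected network classifying a hand-crafted feature vector.
 * Feature count, layer sizes and activation are assumptions.
 */
#include <stdint.h>
#include <math.h>

#define N_FEATURES 8   /* e.g. a small stroke-direction histogram (assumed) */
#define N_HIDDEN   12
#define N_CLASSES  10  /* digits 0..9 */

/* Weights would be filled in by training, on a PC or on the device itself. */
static float w_hidden[N_HIDDEN][N_FEATURES];
static float b_hidden[N_HIDDEN];
static float w_out[N_CLASSES][N_HIDDEN];
static float b_out[N_CLASSES];

static float sigmoidf(float x) { return 1.0f / (1.0f + expf(-x)); }

/* Forward pass: features -> hidden layer -> class scores -> best class. */
uint8_t classify_digit(const float features[N_FEATURES])
{
    float hidden[N_HIDDEN];
    float best = -1e30f;
    uint8_t best_class = 0;

    for (int h = 0; h < N_HIDDEN; h++) {
        float sum = b_hidden[h];
        for (int i = 0; i < N_FEATURES; i++)
            sum += w_hidden[h][i] * features[i];
        hidden[h] = sigmoidf(sum);
    }
    for (int c = 0; c < N_CLASSES; c++) {
        float sum = b_out[c];
        for (int h = 0; h < N_HIDDEN; h++)
            sum += w_out[c][h] * hidden[h];
        if (sum > best) { best = sum; best_class = (uint8_t)c; }
    }
    return best_class;  /* index of the most likely digit */
}

With only a few hundred weights, the whole model fits comfortably in the RAM and flash of a low-cost microcontroller, which is the point of reducing the input to a handful of features first.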

“One possible application here would be the operation of a wearable,” the researchers point out. “In order for this type of communication to work, various persons write the numbers one through nine several times. The neural network receives this training data, learns from it and in the next step identifies the numbers independently. Almost any figure can be trained, not only numbers. This eliminates the need to control the device using speech recognition: the wearable can be controlled with gestures and the user’s privacy remains protected.”

There are practically no limits to the potential applications of AIfES. For example, an armband with integrated gesture recognition could be used to control indoor lighting. And AIfES can not only recognize gestures, it can also monitor how well they are performed: exercises and movements in physical therapy and fitness can be evaluated without a coach or therapist, and privacy is maintained since no camera or cloud service is involved. AIfES can be used in a variety of fields such as automotive, medicine, Smart Home and Industrie 4.0.
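The training workflow described above, where each repetition of a gesture becomes a labelled sample and the model is updated directly on the device, can be sketched as a simple stochastic-gradient-descent step. Again, this is only a hedged illustration of the idea of on-controller learning, not the AIfES code: the function train_on_sample, the single-layer softmax classifier, and the constants N_FEATURES, N_GESTURES and LEARNING_RATE are assumptions introduced for this example.

/*
 * Illustrative on-device training step (not the AIfES implementation):
 * each user repetition of a gesture yields a small feature vector, and a
 * single-layer softmax classifier is updated by stochastic gradient descent.
 */
#include <math.h>

#define N_FEATURES    8
#define N_GESTURES    9       /* digits one through nine written in the air */
#define LEARNING_RATE 0.05f

static float weights[N_GESTURES][N_FEATURES];
static float bias[N_GESTURES];

/* One SGD update on a single labelled gesture sample (softmax + cross-entropy). */
void train_on_sample(const float features[N_FEATURES], int label)
{
    float scores[N_GESTURES], probs[N_GESTURES];
    float max_score = -1e30f, sum = 0.0f;

    /* Class scores for the current sample. */
    for (int c = 0; c < N_GESTURES; c++) {
        scores[c] = bias[c];
        for (int i = 0; i < N_FEATURES; i++)
            scores[c] += weights[c][i] * features[i];
        if (scores[c] > max_score) max_score = scores[c];
    }
    /* Numerically stable softmax. */
    for (int c = 0; c < N_GESTURES; c++) {
        probs[c] = expf(scores[c] - max_score);
        sum += probs[c];
    }
    /* Gradient of cross-entropy w.r.t. each score is (prob - target). */
    for (int c = 0; c < N_GESTURES; c++) {
        float err = probs[c] / sum - (c == label ? 1.0f : 0.0f);
        bias[c] -= LEARNING_RATE * err;
        for (int i = 0; i < N_FEATURES; i++)
            weights[c][i] -= LEARNING_RATE * err * features[i];
    }
}

Calling such an update once per recorded repetition, over a few passes through the collected samples, is enough for a model of this size to separate a small set of gestures, which is why training can happen on the wearable itself with no cloud connection.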

