
Second €8m phase for AI smart hearing aid project

Technology News | By Nick Flaherty



Researchers in Germany are heading into a second stage of a project to develop smart hearing aids that use machine learning to adapt to the individual needs of the user.

The team at the Collaborative Research Centre (CRC) Hearing Acoustics, led by Prof. Dr. Volker Hohmann at the University of Oldenburg, have been working on smart hearing aid designs for the last four years and have now received €8.1m to develop hearing aids and hearing assistance systems that use artificial intelligence to automatically adjust to different environments.

The device determines the direction of the test person’s gaze and head movements and then adjusts the signal processing to ensure that the targeted sound source can be optimally heard by the test person. The current prototype can be used in field experiments as well as in the lab.
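As a rough illustration only (the function names, the gaze weighting and the simple delay-and-sum beamformer below are assumptions for this sketch, not the CRC's published algorithm), a gaze-driven controller of this kind might combine head orientation and gaze angle into a steering direction for the hearing aid's microphone array:

```python
import numpy as np

# Hypothetical sketch of gaze-driven steering; weights and geometry are
# illustrative placeholders, not the CRC's actual design.

def estimate_target_direction(head_yaw_deg, gaze_yaw_deg, gaze_weight=0.7):
    """Combine head orientation and eye gaze into one steering angle.

    Eye gaze tends to lead head movement, so it is weighted more heavily
    here; the 0.7 weight is an arbitrary placeholder.
    """
    return gaze_weight * gaze_yaw_deg + (1.0 - gaze_weight) * head_yaw_deg

def steer_beamformer(mic_signals, steering_angle_deg, mic_spacing_m=0.01,
                     fs=16000, speed_of_sound=343.0):
    """Delay-and-sum beamforming towards the estimated target direction.

    mic_signals: array of shape (n_mics, n_samples) from the microphone array.
    Wrap-around introduced by np.roll is ignored for brevity.
    """
    n_mics, n_samples = mic_signals.shape
    angle = np.deg2rad(steering_angle_deg)
    out = np.zeros(n_samples)
    for m in range(n_mics):
        # Relative time delay for microphone m in a uniform linear array
        delay_s = m * mic_spacing_m * np.sin(angle) / speed_of_sound
        delay_samples = int(round(delay_s * fs))
        out += np.roll(mic_signals[m], -delay_samples)
    return out / n_mics

# Example: steer towards a talker slightly to the right of the head direction
angle = estimate_target_direction(head_yaw_deg=10.0, gaze_yaw_deg=25.0)
signals = np.random.randn(4, 16000)          # stand-in for one second of audio
enhanced = steer_beamformer(signals, angle)
```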

The team is also working on establishing international standards for complex acoustic scenarios in hearing research and audiology in order to facilitate and enhance exchange between different laboratories. In addition, the CRC aims to develop new hearing-acoustic tests in virtual environments that enable researchers to better identify differences in individual perception. This should make it possible to design diagnostics and hearing aid rehabilitation measures that are optimally tailored to individual needs.

Researchers from the Jade University of Applied Sciences, the Fraunhofer Institute for Digital Media Technology IDMT, the Hörzentrum Oldenburg gGmbH, RWTH Aachen University and the Technical University of Munich are all involved in the project, which is scheduled to run for a total of twelve years.

The Collaborative Research Centre Hearing Acoustics brings together various disciplines, in particular acoustics, psychoacoustics, audiology, engineering sciences and physical modelling.

“In real life, the hearing situation changes constantly because people react to voices and sounds. For example, they turn their head towards the sound source, or shift their gaze in that direction. We call this the ‘acoustic communication loop’,” said Prof. Hohmann. This dynamic loop has received little attention in hearing acoustics in the past, he noted.

In the last few years the team has succeeded in incorporating the hearing aid into this acoustic communication loop. “We have developed a first prototype of the so-called ‘immersive hearing aid’ which constantly assesses the acoustic situation and identifies which sound source a test person is directing their attention towards at a given moment,” said Hohmann.

Among other factors, new perception models developed by the research team for use in different hearing situations have paved the way for this success. “These models predict how a test person will perceive a sound signal in a given situation – whether or not they will be able to follow a conversation in a noisy environment, for instance,” said Hohmann. Simulating hearing with and without hearing impairment in different hearing situations involving background noise and reverberation is essential for the development and evaluation of innovative methods for signal processing in hearing aids.
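A toy sketch of the idea behind such a perception model, assuming a simple logistic mapping from signal-to-noise ratio to predicted intelligibility (the CRC's actual models are far more detailed, and the parameters here are made up for illustration), could look like this:

```python
import numpy as np

# Toy perception model: logistic mapping from SNR to predicted intelligibility.
# Midpoint, slope and the hearing-loss offset are illustrative assumptions.

def predicted_intelligibility(snr_db, midpoint_db=-5.0, slope=0.4):
    """Return a value between 0 and 1: predicted fraction of speech understood."""
    return 1.0 / (1.0 + np.exp(-slope * (snr_db - midpoint_db)))

def can_follow_conversation(speech_level_db, noise_level_db, threshold=0.5,
                            hearing_loss_db=0.0):
    """Crude check whether a listener could follow speech in noise.

    hearing_loss_db shifts the effective SNR to mimic an impaired listener.
    """
    snr = speech_level_db - noise_level_db - hearing_loss_db
    return predicted_intelligibility(snr) >= threshold

# The same noisy restaurant scene for a normal-hearing and an impaired listener
print(can_follow_conversation(65, 62))                      # True
print(can_follow_conversation(65, 62, hearing_loss_db=10))  # False
```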

Another important result from the first stage is the “hearpiece”, a high-quality earpiece for research purposes. Inserted in the ear and featuring several integrated microphones and small loudspeakers, the device can boost sound in exactly the same way as a hearing aid. The researchers can use it to test new algorithms for signal processing directly in the ear, for example.

The key feature is that the hearpiece is acoustically transparent, corresponding to normal hearing with an open ear. “Thanks to the interdisciplinary collaboration within the CRC we were able to combine acoustics and signal processing methods and have made considerable progress as a result,” said Hohmann.
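Under strong simplifications, acoustic transparency can be sketched as a hear-through path in which the microphone signal is re-played through the earpiece loudspeakers with a per-band equalisation chosen so that, together with sound leaking past the earpiece, the result resembles the open ear. The flat gains below are placeholders, not measured values:

```python
import numpy as np

# Minimal hear-through sketch: equalise one block of microphone samples in the
# frequency domain. In a real device the gains would be measured so that the
# earpiece output plus leakage matches open-ear listening.

def hear_through(mic_block, eq_gains):
    """Apply per-bin gains to one block of microphone samples."""
    spectrum = np.fft.rfft(mic_block)
    return np.fft.irfft(spectrum * eq_gains, n=len(mic_block))

block = np.random.randn(512)              # stand-in for one 32 ms audio block
gains = np.ones(512 // 2 + 1)             # flat EQ = plain pass-through
out = hear_through(block, gains)
```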

The team has also developed an interactive, audiovisual virtual reality set-up in the lab for conducting hearing experiments with test subjects under controlled conditions. This allows real-life situations to be simulated more realistically than was previously possible.

The team created several complex audiovisual scenarios in which testers can immerse themselves, including a virtual restaurant, an underground station and a living room. These scenarios, together with the related data, have been made freely available to research laboratories across the world so that they can conduct their own hearing experiments.

The CRC team now plans to refine and merge its perception models, algorithms and applications. One goal is to develop algorithms for the hearpiece and the immersive hearing aid that can actively control noise depending on the acoustic scenario.

The long-term goal is for each hearing aid to learn continuously and get better at predicting which setting is optimal for the respective user in a specific situation. Users with impaired hearing will be able to enter the necessary feedback themselves via their smartphone. “However, we still have a lot of work to do before we reach this goal,” said Hohmann.
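A hypothetical, much-simplified sketch of such preference learning, in which the aid logs a coarse acoustic-scene label together with the program the user picked on their phone and later recommends the most frequently chosen one (none of this reflects the project's actual method):

```python
from collections import Counter

# Hypothetical preference learner; scene labels and program names are made up.

class PreferenceLearner:
    def __init__(self, default_program="omnidirectional"):
        self.default_program = default_program
        self.history = {}          # scene label -> Counter of chosen programs

    def record_feedback(self, scene, chosen_program):
        """Called when the user confirms or changes a setting on the phone."""
        self.history.setdefault(scene, Counter())[chosen_program] += 1

    def recommend(self, scene):
        """Suggest the program most often chosen for this scene so far."""
        counts = self.history.get(scene)
        if not counts:
            return self.default_program
        return counts.most_common(1)[0][0]

learner = PreferenceLearner()
learner.record_feedback("restaurant", "directional+noise_reduction")
learner.record_feedback("restaurant", "directional+noise_reduction")
learner.record_feedback("living_room", "omnidirectional")
print(learner.recommend("restaurant"))   # -> "directional+noise_reduction"
```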

“In our ageing society it is becoming increasingly urgent to develop hearing aids and other communication aids that work effectively in difficult acoustic environments and really help people in daily life,” said Prof. Dr. Ralph Bruder, president of the University of Oldenburg.

The Collaborative Research Centre complements the research conducted by the Cluster of Excellence Hearing4all, which is also led by researchers from the University of Oldenburg.

www.hz-ol.de
