
AI attack picks up voices from MEMS microphones – update

Business news
By Nick Flaherty

Researchers in the US and Japan have identified a security risk with MEMS microphones used in laptops and smart assistants such as Google Home.

These MEMS devices leak electromagnetic emissions that can be detected, even through a wall, and AI can then be used to reconstruct the voices that the microphone picks up.

The researchers at the University of Florida and the University of Electro-Communications in Japan also identified multiple ways to address the design flaw and say they have shared their work with manufacturers for potential fixes, recommending spread spectrum clocking as a defence.

The researchers tested the MP34DT01-M from STMicroelectronics, the Knowles SPM0405 (now part of Synaptics), the TDK InvenSense ICS-41350 and T3902, and the Vesper VM3000, now part of Qualcomm. All have been contacted for comment.

“With an FM radio receiver and a copper antenna, you can eavesdrop on these microphones. That’s how easy this can be,” said Prof Sara Rampazzi at the University of Florida. “It costs maybe a hundred dollars, or even less.”

Each harmonic of the digital pulses used in the MEMS microphones retains acoustic information, allowing the original audio to be retrieved through simple FM demodulation using standard radio receivers and a simple antenna. An attacker can exploit this vulnerability to capture what the MEMS microphone hears remotely without installing malicious software or tampering with the device.
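
The demodulation step the researchers describe can be sketched in a few lines. The following Python snippet is an illustration, not the researchers' code: the sample rate, carrier frequency and deviation are assumed values, and a pure tone stands in for speech. It models the leaked harmonic as an FM carrier and recovers the tone with a standard quadrature discriminator:

```python
import numpy as np

fs = 1e6                # receiver sample rate (illustrative value)
f_c = 100e3             # carrier standing in for a leaked clock harmonic
f_audio = 1e3           # the "voice" to recover, here a pure test tone
dev = 5e3               # assumed frequency deviation of the leakage
t = np.arange(int(fs * 0.05)) / fs

audio = np.sin(2 * np.pi * f_audio * t)
# The leaked emission behaves like an FM signal: the audio modulates
# the instantaneous frequency around the harmonic
phase = 2 * np.pi * (f_c * t + dev * np.cumsum(audio) / fs)
rf = np.cos(phase)

# Receiver: mix down to baseband and low-pass away the 2*f_c image
iq = rf * np.exp(-2j * np.pi * f_c * t)
iq = np.convolve(iq, np.ones(10) / 10, mode="same")

# FM discriminator: instantaneous frequency is the phase derivative
inst_freq = np.diff(np.unwrap(np.angle(iq))) * fs / (2 * np.pi)

# Audio-rate low-pass to remove residual mixing products
recovered = np.convolve(inst_freq / dev, np.ones(50) / 50, mode="same")

# Compare against the original tone away from the filter edges
mid = slice(5000, 45000)
corr = np.corrcoef(recovered[mid], audio[:-1][mid])[0, 1]
```

The recovered waveform tracks the original tone closely, which is why an off-the-shelf FM receiver tuned to a clock harmonic is sufficient for the attack.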

The researchers used standardized recordings of random sentences to test the attack, testing the technique through walls of varying thicknesses.

The attack achieves up to 94.2% accuracy in recognizing spoken digits at up to 2 metres from a laptop behind a 25 cm concrete wall. Generative AI from OpenAI was used to reconstruct the speech, with a transcription error rate as low as 6.5%.

The researchers tested a range of laptops, the Google Home smart speaker, and headsets used for video conferencing. The eavesdropping worked best on the laptops, in part because their microphones were attached to long wires that served as antennas, amplifying the signal.

However, the vulnerabilities are relatively easy to mitigate. Repositioning the MEMS microphones in laptops would avoid the long cables that amplify the radio leakage, and shifting the clock frequency of the audio processing by 1% also reduced the intelligibility of the signals without impacting performance, so the researchers recommend spread spectrum clocking as a defence.

ST points out that the MEMS microphone the researchers evaluated is a very old one and that a product termination notice (PTN) has already been issued.

“But to tell the truth this is an intrinsic behavior of Digital PDM MEMS microphones,” said ST. “As the PDM signal is a set of square waves based on the clock of the microphone, it has all the odd harmonics, for instance at 3.072MHz the 11th harmonic is at 33.792MHz, that can be captured by a well-built receiver. At the same time it’s enough to shield the PDM wire at system level to avoid this kind of issue. The flat cable they are using in the laptop is not shielded, this is a perfect antenna for PDM signal.”
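ST's arithmetic is easy to verify: a square wave contains only odd harmonics, so the leakage frequencies are odd multiples of the PDM clock. A quick check in Python:

```python
# A square wave contains only odd harmonics, so the leakage
# frequencies are odd multiples of the 3.072 MHz PDM clock
f_clk = 3.072e6
odd_harmonics = {n: n * f_clk for n in range(1, 13, 2)}
# the 11th harmonic lands at 11 * 3.072 MHz = 33.792 MHz
```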

www.usenix.org/conference/usenixsecurity25/presentation/onishi

 
