But as people grow accustomed to talking to Google’s Assistant, Amazon’s Alexa and other voice interfaces, Knowles is trying to add voice-activated assistants to a broader range of battery-powered devices.
“Customers are asking, ‘If I don’t have to run some workload in the cloud, where should I put it instead?’ We say you should push it out to the device. But the current architecture in terms of cost and space is inadequate,” Michael Adell, Knowles’s vice president of intelligent audio for mobile, told Electronic Design. “We want to be more valuable in processing audio after it touches the microphone.”
To do so, Knowles is building advanced audio processing into the microphones themselves. The company introduced a chip last month that lets wireless headphones respond to Alexa when the user says the voice assistant’s wake word, instead of requiring a press of a physical button. The chip, the IA611, combines a microphone and a digital signal processor (DSP) core in a compact single package, reducing size and cost.
Knowles also released a reference design that lowers the bar for adding voice-activated Alexa to wireless earbuds. The solution, based on Knowles’s IA611 smart microphone and Bestechnic’s BES2000 Bluetooth SoC, will “help developers save time and money building new devices for Alexa,” said Priya Abani, Amazon’s general manager of Alexa Voice Services, in a statement last month. Chinese electronics maker Anker said it would use the solution in voice-activated versions of its wireless headphones.
Knowles is trying to address the challenges of adding Alexa to wireless earbuds and other devices with severely limited battery life. The chip, which Knowles calls a SmartMic, is designed to detect someone saying “Alexa” while blocking out loud background noise and cancelling echoes. The microphone lets the device’s other audio processors stay asleep while it alone listens for the wake word.
Once the wake word is detected, the microphone can be used to rouse other parts of the system. The microphone can process voice commands—such as “Answer call” or “Raise the volume” or “Replay last song”—before the headphones send them over Bluetooth to the user’s smartphone. The 43 MHz processor inside has access to a small amount of memory to remember a short list of voice commands.
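The gating flow described above can be sketched in code. This is a purely illustrative model, not Knowles or Amazon firmware: the function names, the string-based "frames," and the `LOCAL_COMMANDS` set are assumptions standing in for the SmartMic's on-chip keyword spotter and its small command table.

```python
# Hypothetical sketch of the always-on gating flow described above.
# All names here are illustrative; none are real Knowles or Amazon APIs.

# Small command set the microphone's DSP can match locally,
# mirroring the examples in the article.
LOCAL_COMMANDS = {"answer call", "raise the volume", "replay last song"}

def detect_wake_word(frame: str) -> bool:
    """Stand-in for the SmartMic's on-chip keyword spotter."""
    return frame.strip().lower() == "alexa"

def handle_audio(frames):
    """Process an audio stream: the main SoC stays 'asleep' until the
    wake word arrives, then the next utterance is routed either to a
    local handler or over Bluetooth to the phone."""
    main_soc_awake = False
    actions = []
    for frame in frames:
        if not main_soc_awake:
            # Only the microphone's low-power DSP is listening here.
            if detect_wake_word(frame):
                main_soc_awake = True        # rouse the rest of the system
                actions.append("wake")
        else:
            cmd = frame.strip().lower()
            if cmd in LOCAL_COMMANDS:
                actions.append(f"local:{cmd}")      # handled on-device
            else:
                actions.append(f"bluetooth:{cmd}")  # forwarded to the phone
            main_soc_awake = False           # return to low-power listening
    return actions
```

The design point the sketch captures is that everything before the `wake` event runs on the microphone's small 43-MHz core, so the power-hungry parts of the earbud stay off until they are actually needed.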
“Today, digital assistants are activated by physical means [with] a press of a button or a tap,” Peter Cooney, principal analyst at SAR Insight & Consulting, said in a statement. He added that “there will be [a] rapid increase in the use of always-on voice triggers from 2019 onwards as battery density improves and more energy-efficient solutions such as smart microphones are implemented.”
Last year, shipments of wireless earbuds like Apple’s AirPods came to more than 40 million units, with total sales of $6 billion, according to estimates by SAR Insight & Consulting. Sales are projected to surge to $10 billion in 2019, with Apple maintaining its early market lead. Wireless earbuds will account for over 60% of all wireless stereo headphones shipped by 2023, the market researcher said.
Apple introduced its second generation of AirPods in March with the ability to wake its voice assistant by saying “Hey, Siri” instead of tapping a button on the side of the headphones. Samsung released a limited set of voice commands in April that can be used to control Galaxy Buds with its Bixby voice assistant. Bose has also added voice-activated Alexa to its latest line of wireless headphones.
This article first appeared in Electronic Design – www.electronicdesign.com