The processes of inference and learning that form the backbone of AI typically take place in large servers, far removed from consumers. New models are changing that, says ABI Research, as more recent frameworks such as Federated Learning, Distributed Learning, and Few-Shot Learning can be deployed directly on consumers’ devices, which have less compute and smaller power budgets, bringing AI to end users.
“This is the direction the market has increasingly been moving in,” says David Lobina, Research Analyst at ABI Research, “though it will take some time before the full benefits of these approaches become a reality, especially in the case of Few-Shot Learning, where an individual smartphone would be able to learn from the data it is itself collecting. This might well prove an attractive proposition for many, as it does not involve uploading data to a cloud server, making for more secure and private data. In addition, devices can be highly personalized and localized, as they can possess high situational awareness and a better understanding of their local environments.”
The firm believes it will take up to 10 years for such on-device learning and inference to become operational, and that the shift will require adopting new technologies, such as neuromorphic chips. It will take place first in more powerful consumer devices, such as autonomous vehicles and robots, before making its way into the likes of smartphones, wearables, and smart home devices.
Big players such as Intel, NVIDIA, and Qualcomm have been working on these models in recent years and, alongside neuromorphic chipset players such as BrainChip and GrAI Matter Labs, have provided chips that offer improved performance on a variety of training and inference tasks. Take-up is still small, but it can potentially disrupt the market, says the firm.
“Indeed,” says Lobina, “these learning models have the potential to revolutionize a variety of sectors, most probably the fields of autonomous driving and the deployment of robots in public spaces, both of which are currently difficult to pull off, particularly in co-existence with other users. Federated Learning, Distributed Learning, and Few-Shot Learning reduce the reliance on cloud infrastructure, allowing AI implementers to create low-latency, localized, and privacy-preserving AI that can deliver a much better experience for end users.”
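The privacy argument above rests on a simple mechanism: in federated learning, each device trains on its own data and shares only model parameters with a coordinating server, which averages them. The sketch below illustrates that idea with the canonical Federated Averaging (FedAvg) pattern on a toy one-parameter linear model; the function names and the toy data are illustrative assumptions, not ABI Research's or any vendor's implementation.

```python
# Minimal sketch of Federated Averaging (FedAvg): devices train
# locally and share only weights, never raw data. The toy model
# y = w * x and all names here are illustrative assumptions.

def local_update(weights, data, lr=0.1, epochs=5):
    """One device's local step: gradient descent on a 1-D linear
    model, using only data that stays on the device."""
    w = weights
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_average(global_w, device_datasets):
    """Server step: collect locally trained weights and average
    them, weighted by each device's dataset size."""
    updates = [(local_update(global_w, d), len(d)) for d in device_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Toy example: three devices, each holding private samples of y = 2x.
devices = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0)],
    [(0.5, 1.0), (1.5, 3.0), (2.5, 5.0)],
]
w = 0.0
for _ in range(20):  # communication rounds
    w = federated_average(w, devices)
# w converges toward 2.0 without any device's data leaving the device
```

The key design point is that only the scalar `w` crosses the device boundary each round, which is what makes the approach attractive from a privacy and bandwidth standpoint.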