According to ABI Research, the edge AI chipset market will reach US$12.2bn in revenues in 2025, overtaking the cloud AI chipset market, which will reach US$11.9bn that year.
At present, the cloud is the center of AI, and most AI training and inference workloads run in public and private clouds. Centralizing these workloads in the cloud has traditionally brought flexibility and scalability. However, the industry is witnessing a shift in the AI paradigm: driven by the need for privacy, cybersecurity, and low latency, AI training and inference workloads are increasingly performed on gateways, devices, and sensors. Recent advances in key domains, including connectivity, cloud computing, new AI learning architectures, and high-performance computational chipsets, have played a critical role in this shift.
"As enterprises start to look for AI solutions in the areas of image and object recognition, autonomous material handling, predictive maintenance, and human-machine interface for end devices, they need to resolve concerns around data privacy, power efficiency, low latency, and strong on-device computing performance," explains Lian Jye Su, Principal Analyst at ABI Research.
"Edge AI will be the answer to this. By integrating an AI chipset designed to perform high-speed inference and quantized federated learning or collaborative learning models, edge AI brings task automation and augmentation to device and sensor levels across various sectors. So much so that it will grow and surpass the cloud AI chipset market in 2025."