
Deci, Intel team to optimize deep learning inference on Intel CPUs


Market news | By Rich Pell



Deci, one of the first companies to participate in the Intel Ignite startup accelerator, says it will now work with Intel to deploy innovative AI technologies to mutual customers. The collaboration, say the companies, takes a significant step towards enabling deep learning inference at scale on Intel CPUs, reducing costs and latency, and enabling new applications of deep learning inference.

New deep learning tasks can be performed in real time on edge devices, and companies running large-scale inference workloads can dramatically cut cloud or data-center costs simply by switching their inference hardware from GPUs to Intel CPUs.

“By optimizing the AI models that run on Intel’s hardware, Deci enables customers to get even more speed,” says Deci CEO and co-founder Yonatan Geifman, “allowing for cost-effective and more general deep learning use cases on Intel CPUs. We are delighted to collaborate with Intel to deliver even greater value to our mutual customers and look forward to a successful partnership.”

The collaboration began with the MLPerf inference benchmark, where Deci's AutoNAC (Automated Neural Architecture Construction) technology accelerated inference of the well-known ResNet-50 neural network on several popular Intel CPUs, cutting the submitted models' latency by up to 11.8x and increasing throughput by up to 11x. AutoNAC uses machine learning to redesign any model to maximize its inference performance on any hardware, all while preserving its accuracy.
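
For a concrete sense of the metrics being reported, the sketch below shows how ResNet-50 inference latency and throughput are commonly measured on a CPU. It uses PyTorch and torchvision purely as an illustrative stand-in; it is not Deci's AutoNAC pipeline or the official MLPerf harness, and the batch size, warm-up and run counts are arbitrary assumptions.

# Minimal sketch (not Deci's or MLPerf's actual harness) of measuring
# ResNet-50 CPU inference latency and throughput with PyTorch/torchvision.
import time
import torch
import torchvision.models as models

model = models.resnet50(weights=None)  # architecture only; weights not needed for timing
model.eval()

batch = torch.randn(1, 3, 224, 224)  # one 224x224 RGB image, batch size 1 (assumed)

with torch.no_grad():
    # Warm-up runs so one-time setup costs don't skew the measurement
    for _ in range(10):
        model(batch)

    runs = 100
    start = time.perf_counter()
    for _ in range(runs):
        model(batch)
    elapsed = time.perf_counter() - start

latency_ms = elapsed / runs * 1000
throughput = runs * batch.shape[0] / elapsed
print(f"Mean latency: {latency_ms:.1f} ms, throughput: {throughput:.1f} images/s")

Benchmarks such as MLPerf follow the same basic pattern, comparing latency and throughput of a baseline model against an optimized one on the same hardware.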

Monica Livingston, AI Solutions and Sales Director at Intel says, “Deci delivers optimized deep learning inference on Intel processors as highlighted in MLPerf. Optimizing advanced AI models on general purpose infrastructure based on Intel Xeon Scalable CPUs allows our customers to meet performance SLAs, reduce cost, decrease time to deployment, and gives them the ability to effectively scale.”

Deci is developing pilots of its platform with select customers in the enterprise, cloud, communications, and media segments to help them scale up and further accelerate their deep learning usage on Intel CPUs. As the results of these engagements are shared, the companies plan to extend the Deci platform to a broader base of customers.

Deci
Intel



