Intel to ship inference AI processor later this year
Business news |
By Rich Pell



Announced at CES 2019, the Intel Nervana is designed to accelerate inference – the classification of data to “infer” a result – for companies with high workload demands. Inference is one of the two main phases in developing and deploying deep learning-based technologies, the other being training.
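The inference-versus-training distinction can be made concrete with a minimal sketch: inference simply applies fixed, already-trained weights to new data to produce a result, with no learning taking place. The toy weights and classifier below are hypothetical illustrations, not related to any Nervana workload.

```python
# Toy pre-trained weights for a 2-feature, 2-class linear classifier.
# Training (the other phase, not shown) is what would have produced these.
WEIGHTS = [
    [2.0, -1.0],   # scores for class 0
    [-1.0, 2.0],   # scores for class 1
]

def infer(features):
    """Inference: classify an input using the fixed weights (no learning)."""
    scores = [sum(w * x for w, x in zip(row, features)) for row in WEIGHTS]
    # "Infer" the result: pick the class with the highest score.
    return max(range(len(scores)), key=lambda i: scores[i])

print(infer([1.0, 0.0]))  # → 0
print(infer([0.0, 1.0]))  # → 1
```

Dedicated inference accelerators like the Nervana target exactly this workload: running many such fixed-weight computations at high throughput.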

Designed to run pre-trained machine-learning algorithms more efficiently than competing devices, the Nervana will go into production later this year, the company says. The processor is built on a 10-nanometer Intel process and will include Ice Lake cores to handle general operations as well as neural network acceleration.

Intel further announced that Facebook is one of its development partners on the project; the company had initially announced a partnership with the social media giant last year.

The company is also developing a companion processor designed for neural network training, which is likewise expected to be available later this year.

Related articles:
Intel to launch commercial neural networking processor in 2019
Intel neural network processor promises to ‘revolutionize’ AI computing
New Intel VPU delivers AI at the edge
Intel brings Deep Learning to the engineering masses
