
AI-on-modules allow quick creation of Edge AI systems

By eeNews Europe



ADLINK’s AI-on-Module (AIoM) products are built around NVIDIA and Intel silicon. The series offers a hardware optimization strategy to help developers address performance and SWaP (size, weight and power) requirements.

Heterogeneous computing is supported through the integration of different types of processing cores.

ADLINK’s AIoM offering includes:

  • Mobile PCI Express Module (MXM) GPU modules: featuring NVIDIA Quadro Embedded GPUs based on the Turing and Pascal architectures.

  • VPU-accelerated SMARC modules: Vizi-AI and Neuron Pi, equipped with the Intel Movidius Myriad X VPU. These enable developers to speed up prototyping, with commercially available options offering tight version control and longevity support.

  • VPU-accelerated COM Express modules: high-performance modules for fast integration of AI acceleration.

Additional form factors include PC/104, VPX, CompactPCI and XMC. Standards like USB3 Vision and GigE Vision are also supported.

The company will show the range at embedded world in a series of demonstrations, including an access control system powered by ADLINK’s MXM GPU module and DLAP-3000-CFL platform, an inspection robot based on the NVIDIA Jetson TX2, and a Vizi-AI development kit that simplifies scaling to other ADLINK AI products.

The integrated hardware/software approach provides flexibility. Developers can start on the low-cost Vizi-AI and choose a processor (e.g. CPU, GPU, VPU, TPU, NPU) at the time of deployment.
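To illustrate the idea of choosing the target processor at deployment time, the following minimal sketch uses Intel’s OpenVINO Runtime (an assumption; the article does not name a specific software stack) with a hypothetical pre-converted model file "model.xml". Only the device name changes when moving between CPU, GPU and Myriad VPU hardware.

# Minimal sketch: the same inference pipeline targets different processors
# by changing the device name selected at deployment time.
# Assumptions: OpenVINO Runtime is installed and "model.xml" is a
# hypothetical, statically shaped IR model; neither comes from the article.
import numpy as np
from openvino.runtime import Core

DEVICE = "CPU"  # could be "GPU" or "MYRIAD" (VPU) on matching hardware

core = Core()
model = core.read_model("model.xml")          # load the network description
compiled = core.compile_model(model, DEVICE)  # compile for the chosen device

# Run one dummy inference to show the call pattern
dummy = np.zeros(tuple(compiled.input(0).shape), dtype=np.float32)
result = compiled([dummy])[compiled.output(0)]
print(result.shape)

The point of the sketch is that the application code stays the same; only the device string, and the underlying ADLINK module, changes when scaling from a prototype such as the Vizi-AI to a deployment platform.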

More information

www.adlink.com

embedded world

Hall 1, Stand #540


 
