Performance booster for automotive AI application developers
The toolkit enables the use of deep-learning-based algorithms for projects such as image recognition, autonomous driving, sensor data fusion, driver monitoring, and other automotive applications. Customers can develop applications in desktop, cloud, and GPU environments and then port the neural networks to an eIQ Auto-compatible S32 processor. NXP’s toolkit, together with an inference engine qualified for automotive electronics, makes it much easier to integrate neural networks into applications with stringent safety requirements.
An example of this is the transition from conventional image recognition algorithms to those based on deep learning. The latter promise better accuracy and easier maintenance for object recognition and classification, but their implementation in vehicles has so far been hampered by significantly higher cost and system complexity.
The new toolkit is intended to significantly reduce the effort required to select and program the integrated compute cores for each layer of a deep learning algorithm, which for customers means faster time to market. According to NXP, the automated selection process increases performance by a factor of 30 compared with other embedded deep learning implementations; this boost comes from making optimum use of the resources already present on the chip. These benefits let developers evaluate, tune, and ultimately deploy their applications for maximum performance.
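To illustrate the idea behind automated per-layer core selection, the following sketch assigns each layer of a network to whichever compute core a simple cost model rates fastest for that layer type. This is a hypothetical toy example, not NXP's eIQ Auto API: the core names, layer types, and throughput figures are all invented for illustration.

```python
# Illustrative sketch only: a toy per-layer "core selection" pass.
# Core names and relative throughput numbers below are invented,
# not measured values for any real NXP S32 device.
CORE_THROUGHPUT = {
    "cpu":  {"conv": 1.0,  "pool": 1.0,  "fc": 1.0},
    "gpu":  {"conv": 8.0,  "pool": 4.0,  "fc": 6.0},
    "apex": {"conv": 30.0, "pool": 10.0, "fc": 5.0},  # hypothetical vision accelerator
}

def assign_cores(layers):
    """Pick the fastest core for each layer of a network description.

    `layers` is a list of (name, layer_type) tuples; returns a
    mapping from layer name to the chosen core.
    """
    plan = {}
    for name, layer_type in layers:
        best_core = max(
            CORE_THROUGHPUT,
            key=lambda core: CORE_THROUGHPUT[core].get(layer_type, 0.0),
        )
        plan[name] = best_core
    return plan

network = [("conv1", "conv"), ("pool1", "pool"), ("fc1", "fc")]
print(assign_cores(network))
# → {'conv1': 'apex', 'pool1': 'apex', 'fc1': 'gpu'}
```

The point of the sketch is that the mapping differs per layer: under this toy cost model the convolution layers go to the accelerator while the fully connected layer runs faster on the GPU, which is the kind of decision a developer would otherwise make by hand for every layer.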
NXP cites the key advantage of eIQ Auto and its integration on an S32V processor as its compliance with development standards relevant to the automotive industry and with all functional safety requirements. The inference engine integrated in eIQ Auto was developed to strict specifications and is Automotive SPICE compliant. The S32V processors offer a high degree of functional safety, supporting ISO 26262 up to ASIL-C as well as IEC 61508 and DO-178.
NXP’s eIQ development environment for machine learning software enables machine learning algorithms to run on MCUs, on the i.MX RT Crossover MCUs, and on the SoCs of NXP’s i.MX product family. The eIQ family includes inference engines, neural network compilers, and optimized libraries.
More information: www.nxp.com/eiq and www.nxp.com/eiqauto
