Through an investment, NXP has signed an exclusive, strategic partnership with Au-Zone Technologies that will see NXP’s eIQ Machine Learning (ML) software development environment expanded with Au-Zone’s ML tools.
NXP has also been working with Arm as the lead technology partner to bring Arm’s Ethos-U microNPU (Neural Processing Unit) architecture to applications processors. As a result of this collaboration, NXP will integrate the Ethos-U65 microNPU into its next generation of i.MX applications processors.
Au-Zone’s DeepView ML Tool Suite will provide eIQ with an intuitive GUI and workflow that allow developers to import datasets and models, then quickly train and deploy NN models and ML workloads across NXP’s edge processing portfolio. The eIQ-DeepViewML Tool Suite’s graph-level profiling capability will provide in-depth run-time insights for optimizing NN model architectures, system parameters, and run-time performance. The addition of Au-Zone’s DeepView run-time inference engine to the open-source inference technologies in NXP eIQ will enable rapid deployment and evaluation of ML workloads and performance across NXP devices. The run-time inference engine optimizes system memory usage and data movement for each SoC architecture.
“NXP’s scalable applications processors deliver an efficient product platform and a broad ecosystem for our customers to quickly deliver innovative systems,” said Ron Martino, senior vice president and general manager of Edge Processing business line at NXP Semiconductors. “Through these partnerships with both Arm and Au-Zone, in addition to technology developments within NXP, our goal is to continuously increase the efficiency of our processors while simultaneously increasing our customers’ productivity and reducing their time to market.”
Brad Scott, CEO of Au-Zone, said, “We created DeepView™ to provide developers with intuitive tools and inferencing technology, so this partnership represents a great union of world-class silicon, run-time inference engine technology, and a development environment that will further accelerate the deployment of embedded ML features. This partnership builds on a decade of engineering collaboration with NXP and will serve as a catalyst to deliver more advanced machine learning technologies and turnkey solutions as OEMs continue to transition inferencing to the edge.”
NXP’s integration of the Arm Ethos-U65 microNPU builds on its previously announced i.MX 8M Plus applications processor with integrated NPU. NXP and Arm partnered to define the system-level aspects of the microNPU, which can support up to 1 TOPS (512 parallel multiply-accumulate operations at 1 GHz). The Ethos-U65 retains the MCU-class power efficiency of the Ethos-U55 while expanding its applicability to higher-performance Cortex-A-based SoCs. The Ethos-U65 microNPU works alongside the Cortex-M core found in NXP’s i.MX families of heterogeneous SoCs.
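The 1 TOPS figure follows the common convention of counting each multiply-accumulate (MAC) as two operations (one multiply plus one add). A quick back-of-envelope check of the numbers quoted above (an illustrative sketch, not NXP or Arm code):

```python
def peak_tops(macs_per_cycle: int, clock_hz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in TOPS, counting each MAC as `ops_per_mac` operations."""
    return macs_per_cycle * ops_per_mac * clock_hz / 1e12

# 512 parallel MACs at 1 GHz, as quoted for the Ethos-U65 configuration
print(peak_tops(512, 1e9))  # 1.024, i.e. roughly the "up to 1 TOPS" figure
```

Note this is a theoretical peak; sustained throughput depends on model structure, memory bandwidth, and utilization.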
“There has been a surge of AI and ML across industrial and IoT applications driving demand for more on-device ML capabilities,” said Dennis Laudick, vice president of Marketing, Machine Learning Group, at Arm. “The Ethos-U65 will power a new wave of edge AI, providing NXP customers with secure, reliable, and smart on-device intelligence.”
The Arm Ethos-U65 will be available in future NXP i.MX applications processors. The eIQ-DeepViewML Tool Suite and the DeepView run-time inference engine, integrated into eIQ, will be available in Q1 2021.