The ML processor can be used on its own, but it can also be paired with the OD processor, which ARM describes as its second-generation object detection processor. The OD processor is designed to work with two-dimensional data, and with visual data in particular.
The OD processor scans each frame at 60fps and produces a list of detected objects, along with their locations within the scene. The device detects human forms, faces, and heads and shoulders, and can even determine the direction each person is facing. It can detect objects as small as 50 by 60 pixels.
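The described behavior, a per-frame list of detections with object type, location, and a minimum detectable size of 50 by 60 pixels, can be pictured with a small sketch. The record layout and filter below are illustrative assumptions; ARM has not published the OD processor's actual output format.

```python
from dataclasses import dataclass

# Hypothetical record for one detection; the real OD processor's
# output format is not documented in the article.
@dataclass
class Detection:
    kind: str     # e.g. "body", "face", "head_shoulders"
    x: int        # top-left corner within the frame, in pixels
    y: int
    width: int
    height: int
    facing: str   # direction the person is facing, e.g. "left"

MIN_W, MIN_H = 50, 60  # smallest object size the article quotes

def detectable(d: Detection) -> bool:
    """Keep only boxes at least as large as the stated minimum."""
    return d.width >= MIN_W and d.height >= MIN_H

# One frame's worth of candidate detections.
frame = [
    Detection("face", 120, 80, 64, 72, "camera"),
    Detection("body", 300, 40, 40, 48, "left"),  # below the 50x60 minimum
]
detected = [d for d in frame if detectable(d)]
```

Running the filter leaves only the 64x72-pixel face, since the second candidate falls below the quoted 50 by 60 pixel floor.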
ARM claims the OD processor offers 80x the performance of a traditional DSP, along with a significant improvement in detection quality relative to previous ARM technologies.
The OD processor is intended to be used as a pre-processor to detect regions of interest – and particularly people of interest – and it can be used with ARM Cortex CPUs, Mali GPUs, and the ML processor.
The ML and OD processors can be deployed together or separately, and both can make use of ARM NN software and the ARM Compute Library (see ARM's soft launch of its machine learning library).
Figure: Roadmap for ARM NN software as it bridges from frameworks such as TensorFlow and Caffe to various processors. Source: ARM
ARM NN software, when used alongside the ARM Compute Library and CMSIS-NN, is optimized for neural networks and bridges the gap between NN frameworks such as TensorFlow, Caffe, and Android NN and the full range of Cortex CPUs, Mali GPUs, and ML processors.
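Conceptually, a bridging layer like this takes a graph exported from a framework and assigns each operation to the most capable available backend, falling back toward the CPU, which can run everything. The sketch below is a generic illustration of that dispatch idea; the backend names and support table are assumptions, not the actual ARM NN API.

```python
# Illustrative backend-preference dispatch, not the real ARM NN API.
# Prefer the ML processor, then the GPU, then the CPU as a fallback.
BACKEND_PREFERENCE = ["MLProcessor", "MaliGPU", "CortexCPU"]

# Hypothetical support table: which operations each backend can run.
SUPPORTED = {
    "MLProcessor": {"conv2d", "fully_connected", "relu"},
    "MaliGPU": {"conv2d", "fully_connected", "relu", "softmax"},
    "CortexCPU": {"conv2d", "fully_connected", "relu", "softmax", "custom_op"},
}

def assign_backend(op: str) -> str:
    """Pick the first preferred backend that supports this operation."""
    for backend in BACKEND_PREFERENCE:
        if op in SUPPORTED[backend]:
            return backend
    raise ValueError(f"no backend supports {op}")

# A toy network graph, as a flat list of operations.
graph = ["conv2d", "relu", "softmax", "custom_op"]
placement = {op: assign_backend(op) for op in graph}
```

The fallback order is the point: common layers land on the accelerator, while anything the accelerator cannot handle still runs, just on a less specialized backend.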
Jem Davies, general manager of ARM's machine learning business, said that the advent of machine learning represents "the biggest inflection point in computing for more than a generation." He added that it will be done at the edge rather than in data centers wherever possible, for reasons of energy efficiency, latency, safety-criticality, economics, and privacy.
ARM added that future ML products will enable developers to pick their point on a performance curve, from sensors and smart speakers to mobile, home entertainment, and beyond.
ARM stated that its machine learning IP suite will be generally available in mid-2018.
Related links and articles: