
Porting TinyML to analog in-memory compute
Researchers in France have developed technology to bring TinyML machine learning frameworks to analog in-memory compute for lower-power embedded AI.
The AnalogNAS framework can be used to port TinyML AI models to low-power analog in-memory compute (IMC) arrays based on phase change memory, giving a much simpler implementation of AI applications such as wake-word detection or signal recognition.
The weights of linear, convolutional, and recurrent DNN layers are mapped to crossbar arrays of non-volatile memory (NVM) elements. By exploiting basic Kirchhoff's circuit laws, matrix-vector multiplications (MVMs) can be performed by encoding inputs as word-line (WL) voltages and weights as device conductances. For most computations, this removes the need to pass data back and forth between central processing units (CPUs) and memory.
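The physics behind this can be sketched numerically. The snippet below is a minimal illustration, not the chip's actual scheme: it assumes a common differential-pair encoding in which a signed weight is stored as the difference of two non-negative conductances, with Ohm's law (I = G·V) providing the per-cell multiply and Kirchhoff's current law summing the currents on each bit line.

```python
import numpy as np

# Illustrative sketch of an analog IMC crossbar computing y = W @ x.
# The differential-pair encoding and all names here are assumptions
# for illustration, not details from the chip described in the article.

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))      # DNN layer weights (signed)
x = rng.standard_normal(8)           # layer input

# Signed weights stored as a difference of non-negative conductances:
# W = G_pos - G_neg, since a real device conductance cannot be negative.
G_pos = np.clip(W, 0, None)
G_neg = np.clip(-W, 0, None)

# Inputs are encoded as word-line voltages; by Kirchhoff's current law,
# each bit line accumulates the I = G * V contributions of its column.
I_pos = G_pos @ x
I_neg = G_neg @ x
y_analog = I_pos - I_neg             # differential read-out

# The analog result matches the digital matrix-vector multiplication.
assert np.allclose(y_analog, W @ x)
```

In a real PCM array the read-out would also include conductance drift and programming noise, which is part of why hardware-aware network search such as AnalogNAS matters.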
The framework provides automated DNN design targeting deployment on IMC inference accelerators, and the resulting networks achieve higher accuracy than state-of-the-art digital models when run on a 64-core IMC chip based on phase change memory (PCM).
Hadjer Benmeziane of CNRS worked with researchers from IBM in Zurich, New York and California on the development of the framework and an experimental chip.
Each core in the 64-core IMC chip consists of a crossbar array of 256×256 PCM-based unit cells along with a local digital processing unit. Two networks for the CIFAR-10 image classification task were tested on the hardware: AnalogNAS T500 and the baseline ResNet32.
The computational inference graph of each network was exported and used to generate proprietary data flows to be executed in-memory. As only hardware accuracy was being validated, all operations other than the MVMs were performed on a host machine connected to the chip through an FPGA.
The measured hardware accuracy was 92.05% for T500 and 89.87% for ResNet32, significantly better than ResNet32 running on digital hardware.
