Neural network teaches itself to count cars
BrainChip’s neural network processor is known as SNAP and uses signal spikes as a means of data transfer and a method known as Spike Timing-Dependent Plasticity (STDP) for learning (see BrainChip provides details of neural network architecture).
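BrainChip has not published the precise learning rule SNAP implements in hardware. Purely as an illustration, the Python sketch below shows the generic pair-based STDP update commonly described in the literature, in which a synapse is strengthened when a presynaptic spike precedes a postsynaptic spike and weakened when the order is reversed; all parameter names and values are illustrative assumptions, not SNAP's.

```python
import math

# Illustrative, assumed parameters for a generic pair-based STDP rule.
# These do not come from BrainChip; SNAP's actual rule is proprietary.
A_PLUS = 0.01     # maximum potentiation step
A_MINUS = 0.012   # maximum depression step
TAU_PLUS = 20.0   # potentiation time constant (ms)
TAU_MINUS = 20.0  # depression time constant (ms)

def stdp_delta_w(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair (times in ms).

    If the presynaptic spike arrives before the postsynaptic spike
    (dt > 0) the synapse is potentiated; otherwise it is depressed.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    return -A_MINUS * math.exp(dt / TAU_MINUS)

# Example: a pre-spike 5 ms before a post-spike strengthens the synapse,
# while the reverse ordering weakens it.
print(stdp_delta_w(t_pre=10.0, t_post=15.0))   # positive (potentiation)
print(stdp_delta_w(t_pre=15.0, t_post=10.0))   # negative (depression)
```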
The AVFE (Autonomous Visual Feature Extraction) is a breakthrough in that it has demonstrated unsupervised learning from a visual data stream, with implications for applications such as collision avoidance in autonomous driving and drones.
The AVFE on SNAP is able to process 100 million visual events per second and, within seconds, learns and identifies patterns in the image stream, BrainChip said in a regulatory statement for the Australian Stock Exchange. The AVFE/SNAP combination was attached to a Davis artificial retina, purchased from its developer iniLabs GmbH (Zurich, Switzerland), as a source of streaming digital visual information.
The Davis Dynamic Vision Sensor is an artificial retina with an AER (Address Event Representation) interface, the same interface used by SNAP. Rather than producing frames of video, each pixel outputs one or more spikes whenever the contrast it sees changes.
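The article does not describe the exact AER packet format exchanged between the Davis sensor and SNAP. The sketch below assumes a commonly used event layout (pixel address, polarity, timestamp) purely to illustrate how a spike-based event stream differs from conventional video frames.

```python
from dataclasses import dataclass

@dataclass
class AEREvent:
    """One address-event: which pixel fired, in which direction, and when.

    Field names and types are assumptions for illustration; the actual
    Davis/SNAP AER packet format is not described in the article.
    """
    x: int             # pixel column address
    y: int             # pixel row address
    polarity: bool     # True = contrast increased, False = contrast decreased
    timestamp_us: int  # microsecond timestamp of the spike

def events_in_window(events, start_us, end_us):
    """Select the events that occurred in a given time window.

    Unlike a video frame, which samples every pixel at a fixed rate,
    an event stream only contains entries for pixels whose contrast
    actually changed, so static regions of the scene generate no data.
    """
    return [e for e in events if start_us <= e.timestamp_us < end_us]

# Example: three spikes from three pixels over a 100 microsecond span.
stream = [
    AEREvent(x=12, y=40, polarity=True, timestamp_us=10),
    AEREvent(x=13, y=40, polarity=True, timestamp_us=55),
    AEREvent(x=12, y=41, polarity=False, timestamp_us=90),
]
print(events_in_window(stream, 0, 60))  # first two events only
```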
Potential applications for the AVFE running on SNAP and linked to an appropriate source include collision avoidance systems in road vehicles and drones, anomaly detection, surveillance and medical imaging.
The system initially has no knowledge of the contents of an input stream. It learns autonomously by repetition and intensity, and starts to find patterns in the image stream. This image stream can originate from a visible-light image sensor such as Davis, or alternatively from an appropriately engineered radar or ultrasound source.
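As a toy illustration of learning by repetition, the sketch below reinforces inputs that fire together repeatedly and lets rarely active inputs decay. It is a drastic simplification, not BrainChip's SNAP circuitry, and every parameter in it is an assumption.

```python
import random

# A toy model of repetition-driven learning, included only for illustration.
NUM_INPUTS = 16
LEARN_RATE = 0.05   # reinforcement applied to each active input
DECAY = 0.01        # decay applied to each inactive input

weights = [0.0] * NUM_INPUTS

# A recurring stimulus (e.g. the event signature of a passing car) activates
# the same subset of inputs every time it appears in the stream.
RECURRING_PATTERN = {1, 2, 3, 8, 9}

def present(active_inputs):
    """Reinforce active inputs and decay inactive ones."""
    for i in range(NUM_INPUTS):
        if i in active_inputs:
            weights[i] = min(1.0, weights[i] + LEARN_RATE)
        else:
            weights[i] = max(0.0, weights[i] - DECAY)

# Interleave the recurring pattern with random noise. Only the weights that
# belong to the repeated pattern grow consistently and saturate near 1.0.
for _ in range(200):
    present(RECURRING_PATTERN)
    present({random.randrange(NUM_INPUTS) for _ in range(5)})

print([round(w, 2) for w in weights])
```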
The AVFE was tested on a highway in Pasadena, California, in a trial run lasting 78.5 seconds. The SNAP spiking neural network learned to recognize cars and started counting them in real time.
Peter van der Made, BrainChip CEO and inventor of the SNAP neural processor, said: “We are very excited about this significant advancement. It shows that BrainChip’s neural processor SNAP acquires information and learns without human supervision from visual input.”
The development of AVFE has prompted BrainChip to expand its commercial efforts and form a partnership with Applied Brain Research Inc. (Waterloo, Ontario). The two companies have entered into a joint development and marketing agreement.
Related links and articles:
News articles:
Startup wants to be the ARM of neuromorphic cores
BrainChip provides details of neural network architecture
Startup plans neural network front-end for sensor systems
How will deep learning change SoCs?
IBM True North puts brain on a chip