Forum goes back to the future in Antwerp

April 17, 2018 // By Peter Clarke
Research institute IMEC (Leuven, Belgium) is returning to Antwerp in May to host its annual technology forum, which will provide a platform for more than 60 experts to discuss the future of nanotechnology.

One of the big pressure points in present-day electronics is the so-called memory bottleneck, IMEC CEO Luc Van den hove said. With ever-increasing amounts of data being captured by image and other sensors, performance is increasingly limited by the time and energy cost of moving data to and from processors. So, is the solution a change in memory device or in computer architecture?
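To give a rough sense of scale for that claim, the back-of-the-envelope comparison below contrasts the energy of an on-chip arithmetic operation with that of fetching a word from off-chip DRAM for one sensor frame. The per-operation energies and the workload are illustrative assumptions, not IMEC figures; only the order-of-magnitude gap matters.

```python
# Back-of-the-envelope comparison of compute vs. data-movement energy for one
# image-sensor frame. The per-operation energies and workload below are rough,
# illustrative assumptions (order of magnitude only), not measured IMEC data.

E_ALU_OP_PJ = 1.0         # ~1 pJ for a simple 32-bit on-chip arithmetic op
E_DRAM_ACCESS_PJ = 640.0  # ~hundreds of pJ to fetch a 32-bit word from DRAM

pixels = 8_000_000        # an assumed 8-megapixel frame
ops_per_pixel = 100       # assumed arithmetic operations spent on each pixel
words_moved = pixels      # assume each pixel is fetched from DRAM once

compute_uj = pixels * ops_per_pixel * E_ALU_OP_PJ / 1e6
movement_uj = words_moved * E_DRAM_ACCESS_PJ / 1e6

print(f"compute:       {compute_uj:8.0f} microjoules per frame")
print(f"data movement: {movement_uj:8.0f} microjoules per frame")
print(f"movement is {movement_uj / compute_uj:.1f}x the compute energy")
```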

"The answer is a combination of both," said Van den hove. "Scaling of some conventional memories continues – such as 3D NAND flash memory; there are also different innovations in memories, but also we are looking at how to compute at the edge so there is less need to move data to the cloud. We are also being helped by new computer concepts such as neuromorphic computing and cryo-temperature storage."

One of the ideas that comes out of neuromorphic computing is in-memory computing. "We gave a glimpse of this at ITF last year and we will give an update this year," Van den hove said.
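In-memory computing is commonly illustrated with a resistive crossbar: weights are stored as conductances in the memory array itself, and a matrix-vector multiply happens in place as applied voltages produce summed column currents, so the operands never make the round trip to a separate processor. The sketch below is a minimal numerical illustration of that general idea, not a description of IMEC's implementation; the array size and noise model are assumptions.

```python
import numpy as np

# Minimal sketch of analog in-memory matrix-vector multiplication on a crossbar:
# weights live in the memory array as conductances G, inputs are applied as
# voltages V, and each column current is a dot product (Ohm's law plus
# Kirchhoff current summation). Sizes and noise level are assumptions.

rng = np.random.default_rng(0)

rows, cols = 128, 64                     # crossbar dimensions (assumed)
G = rng.uniform(0.0, 1.0, (rows, cols))  # conductance matrix = stored weights
V = rng.uniform(0.0, 0.2, rows)          # input voltages on the word lines

I_ideal = V @ G                          # column currents: the in-place result

# Analog devices are imperfect, so add a small read-noise term to each current.
I_measured = I_ideal + rng.normal(0.0, 0.01 * I_ideal.std(), cols)

print("relative error:", np.abs(I_measured - I_ideal).mean() / np.abs(I_ideal).mean())
```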

There are at least two major strands of development in artificial intelligence hardware. One is the relatively near-term development of machine learning hardware based on the implementation and acceleration of simplified artificial neural networks.

While that approach opens up exciting possibilities, it tends to depend on fairly conventional logic architectures and on asymmetric training and inference procedures. Training can draw on very large data sets, which helps accuracy, but it is by nature energy-intensive, even when done in the cloud. More advanced neuromorphic architectures seek to follow the biological model of the brain more closely, with stricter models of the analog behaviour of synapses and axons, of reinforcing and inhibiting impulses, and with the capability for independent self-learning. While such systems could learn to construct a model of their environment, rather as a baby does, the electronic domain has the advantage that such learning can be shared efficiently, creating swarm intelligence.
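For readers unfamiliar with the spiking behaviour such architectures emulate, the toy leaky integrate-and-fire neuron below integrates reinforcing (excitatory) and inhibiting (inhibitory) input spikes in an analog membrane potential and fires when a threshold is crossed. All constants and input rates are illustrative assumptions and do not describe any specific IMEC or commercial design.

```python
import numpy as np

# Toy leaky integrate-and-fire neuron: the membrane potential leaks toward rest,
# integrates excitatory and inhibitory input spikes, and emits an output spike
# when it crosses a threshold. Every constant here is an assumption.

rng = np.random.default_rng(42)

dt = 1e-3                  # simulation time step: 1 ms
tau = 20e-3                # membrane leak time constant: 20 ms
v_rest, v_thresh = 0.0, 0.5
w_exc, w_inh = 0.3, -0.2   # synaptic weights (reinforcing / inhibiting)

v = v_rest
spike_times = []

for step in range(1000):           # simulate one second
    exc = rng.random() < 0.1       # incoming excitatory spike this step?
    inh = rng.random() < 0.02      # incoming inhibitory spike this step?

    v += dt / tau * (v_rest - v)   # leak toward rest
    v += w_exc * exc + w_inh * inh # add the weighted effect of input spikes

    if v >= v_thresh:              # threshold crossed: emit a spike and reset
        spike_times.append(step * dt)
        v = v_rest

print(f"output spikes in 1 s: {len(spike_times)}")
```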

"The impression I have is that both are evolving quickly and it depends on the application. We have fast pattern recognition algorithms but also autonomous system self-learning is going to be extremely important, for example is things like autonomous driving. It tends to impact in-memory implementation. In the first case there may be a tendency to use STT-MRAM [Spin-Transfer Torque Magnetic Random Access Memory] while in the second case with self-learning researchers are tending to look more to the analog behavior in Resistive RAM."

There is a break-out session on machine learning on the first day of the ITF and Nigel Toon, CEO of Graphcore Ltd., is due to discuss his company’s Intelligence Processing Unit (IPU), which has been designed to accelerate machine learning and AI applications.


