
Optical chip can train machine learning hardware
Researchers from the George Washington University, Queen's University, the University of British Columbia, and Princeton University say they have developed an optical chip that can train machine learning hardware. The chip is designed to address the surging AI "appetite" that leaves an ever-widening gap between computer hardware and the demand for AI.
Photonic integrated circuits have emerged as a possible way to deliver higher computing performance, as measured by the number of operations performed per second per watt consumed (TOPS/W). However, the researchers note that while photonic chips have demonstrated improved core operations for machine-intelligence tasks such as data classification, they have yet to accelerate the actual front-end learning and training process.
Machine learning is a two-step procedure: first, data is used to train the system, and then other data is used to test its performance. In their work, the researchers set out to carry out that training step directly on photonic hardware.
After one training step, the researchers observed the error and reconfigured the hardware for a second training cycle, followed by additional cycles until the system reached sufficient performance (e.g., it could correctly label objects appearing in a movie). Until now, photonic chips have only demonstrated an ability to classify and infer information from data; the researchers say they have made it possible to speed up the training step itself.
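The paper the team links to names direct feedback alignment (DFA) as the training scheme. As a rough illustration of the idea, here is a minimal NumPy sketch on a hypothetical toy problem (XOR labels): rather than backpropagating the error through the transpose of the forward weights, DFA projects the output error to the hidden layer through a fixed random feedback matrix. This is an electronic software sketch of the concept only, not the team's photonic implementation; the network sizes, learning rate, and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer network; all sizes are illustrative assumptions.
n_in, n_hidden, n_out = 2, 16, 1
W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
B = rng.normal(0.0, 0.5, (n_out, n_hidden))  # fixed random feedback matrix

# Hypothetical toy data: the XOR problem.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse():
    return float(np.mean((sigmoid(np.tanh(X @ W1) @ W2) - y) ** 2))

loss_before = mse()
lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1)      # hidden-layer activations
    out = sigmoid(h @ W2)    # network output
    delta = out - y          # output error (sigmoid + cross-entropy delta)
    # DFA step: deliver the error to the hidden layer via the fixed
    # random matrix B instead of W2.T, then apply the tanh derivative.
    dh = (delta @ B) * (1.0 - h ** 2)
    W2 -= lr * h.T @ delta
    W1 -= lr * X.T @ dh
loss_after = mse()
```

Because the feedback path is fixed and random, each layer's update can be computed as soon as the output error is known, which is what makes the scheme attractive for parallel photonic hardware.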
This added AI capability is part of a larger effort around photonic tensor cores and other electronic-photonic application-specific integrated circuits (ASICs) that leverage photonic chip manufacturing for machine learning and AI applications.
“This novel hardware will speed up the training of machine learning systems and harness the best of what both photonics and electronic chips have to offer,” says Volker Sorger, Professor of Electrical and Computer Engineering at the George Washington University and founder of the start-up company Optelligence. “It is a major leap forward for AI hardware acceleration. These are the kinds of advancements we need in the semiconductor industry as underscored by the recently passed CHIPS Act.”
Bhavin Shastri, Assistant Professor in the Department of Physics at Queen's University, adds, "The training of AI systems costs a significant amount of energy and carbon footprint. For example, training a single AI transformer produces about five times as much CO2, through its electricity use, as a gasoline car emits over its lifetime. Our training on photonic chips will help to reduce this overhead."
For more, see “Silicon photonic architecture for training deep neural networks with direct feedback alignment.”
