Teaching photonic chips to learn

George Washington University
Image: A picture of the chip used for this work.
Credit: The George Washington University/Queen's University
SUMMARY
A multi-institution research team has developed an optical chip that can train machine learning hardware. 
THE SITUATION
Machine learning applications have skyrocketed to $165 billion annually, according to a recent report from McKinsey. But before a machine can perform intelligent tasks, such as recognizing the details of an image, it must be trained. Training modern-day artificial intelligence (AI) systems like Tesla's Autopilot costs several million dollars in electric power consumption and requires supercomputer-like infrastructure. This surging AI “appetite” leaves an ever-widening gap between computer hardware and the demand for AI. Photonic integrated circuits, or simply optical chips, have emerged as a possible solution to deliver higher computing performance, as measured in tera-operations per second per watt (TOPS/W). However, while photonic chips have demonstrated improved core operations for machine intelligence tasks such as data classification, they have yet to improve the actual front-end learning and machine training process.
THE SOLUTION
Machine learning is a two-step procedure: first, data is used to train the system, and then other data is used to test the performance of the AI system. Thus far, photonic chips have only demonstrated an ability to classify and infer information from data; the training step itself has remained out of reach. In a new paper, a team of researchers from the George Washington University, Queen's University, the University of British Columbia and Princeton University has now made it possible to speed up that training step on photonic hardware. After one training cycle, the team observed the error and reconfigured the hardware for a second training cycle, followed by additional cycles until a sufficient AI performance was reached (e.g., the system is able to correctly label objects appearing in a movie).
This added AI capability is part of a larger effort around photonic tensor cores and other electronic-photonic application-specific integrated circuits (ASICs) that leverage photonic chip manufacturing for machine learning and AI applications.
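For readers who want to see the train-observe-reconfigure loop described above in software terms, below is a minimal NumPy sketch of direct feedback alignment (DFA), the training rule named in the paper's title: instead of backpropagating errors through the network's own weights, the output error is sent to each layer through a fixed random projection, a structure that maps naturally onto analog and photonic hardware. The network size, activation function, and learning rate are illustrative assumptions; this is a conceptual sketch, not the authors' photonic implementation.

```python
# Minimal sketch (hypothetical, not the authors' code) of direct feedback
# alignment (DFA). The output error is projected to the hidden layer through
# a fixed random matrix rather than through transposed weight matrices.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer network: 8 inputs -> 16 hidden units -> 4 outputs.
W1 = rng.normal(0, 0.1, (16, 8))
W2 = rng.normal(0, 0.1, (4, 16))
B1 = rng.normal(0, 0.1, (16, 4))   # fixed random feedback matrix, never trained

def tanh_prime(z):
    return 1.0 - np.tanh(z) ** 2

def dfa_step(x, y, lr=0.05):
    """One DFA training cycle: forward pass, error, direct feedback update."""
    global W1, W2
    z1 = W1 @ x
    a1 = np.tanh(z1)
    y_hat = W2 @ a1                      # linear output layer
    e = y_hat - y                        # output error
    delta1 = (B1 @ e) * tanh_prime(z1)   # error projected directly to layer 1
    W2 -= lr * np.outer(e, a1)           # output layer uses the true error
    W1 -= lr * np.outer(delta1, x)       # hidden layer uses the random projection
    return float(0.5 * np.sum(e ** 2))

# Repeat training cycles until the error is sufficiently small.
x = rng.normal(size=8)
y = np.array([1.0, 0.0, 0.0, 0.0])
for cycle in range(200):
    loss = dfa_step(x, y)
print(f"final loss: {loss:.4f}")
```

Because the feedback matrix is fixed, each training cycle reduces to the same kind of matrix-vector products used for inference, which is the operation photonic tensor cores are built to accelerate.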
FROM THE RESEARCHERS
"This novel hardware will speed up the training of machine learning systems and harness the best of what both photonics and electronic chips have to offer. It is a major leap forward for AI hardware acceleration. These are the kinds of advancements we need in the semiconductor industry as underscored by the recently passed CHIPS Act.” 
Volker Sorger, Professor of Electrical and Computer Engineering at the George Washington University and founder of the start-up company Optelligence. 
"The training of AI systems costs a significant amount of energy and carbon footprint. For example, a single AI transformer takes about five times as much CO2 in electricity as a gasoline car spends in its lifetime. Our training on photonic chips will help to reduce this overhead.” 
Bhavin Shastri, Assistant Professor of Physics Department Queens University
PUBLICATION INFORMATION
The paper, “Silicon Photonic Architecture for Training Deep Neural Networks with Direct Feedback Alignment,” was published today in the journal Optica. To schedule an interview with Dr. Sorger, please contact Cate Douglass at [email protected].
Journal: Optica
Article title: Silicon photonic architecture for training deep neural networks with direct feedback alignment
Publication date: 22-Nov-2022
Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.
Media Contact
Cate Douglass
George Washington University
[email protected]
