Lensless opto-electronic neural network with quantum dot nonlinear activation


Detailed description

Saved in:
Bibliographic details
Published in: Photonics Research (Washington, DC), 2024-04, Vol. 12 (4), p. 682
Main authors: Shi, Wanxin; Jiang, Xi; Huang, Zheng; Li, Xue; Han, Yuyang; Yang, Sigang; Zhong, Haizheng; Chen, Hongwei
Format: Article
Language: English
Online access: Full text
Description
Abstract: With the swift advancement of neural networks and their expanding applications in many fields, optical neural networks have gradually become a feasible alternative to electrical neural networks due to their parallelism, high speed, low latency, and low power consumption. Nonetheless, optical nonlinearity is hard to realize in free-space optics, which restricts the potential of the architecture. To harness the benefits of optical parallelism while ensuring compatibility with natural light scenes, it becomes essential to implement two-dimensional spatial nonlinearity within an incoherent light environment. Here, we demonstrate a lensless opto-electronic neural network that incorporates optical nonlinearity, capable of performing convolution calculations and achieving nonlinear activation via a quantum dot film, all without an external power supply. In simulation and experiments, the proposed nonlinear system enhances the accuracy of image classification tasks, yielding a maximum improvement of 5.88% over linear models. The scheme shows a facile implementation of passive incoherent two-dimensional nonlinearities, paving the way for applications of multilayer incoherent optical neural networks in the future.
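The architecture described in the abstract — an incoherent optical convolution followed by a passive nonlinear activation — can be sketched numerically. This is a minimal illustration, not the authors' method: the saturable response function, the mask values, and all parameter names (`i_sat`, `psf`) are assumptions standing in for the actual quantum dot film characteristics reported in the paper.

```python
import numpy as np

def optical_conv2d(image, psf):
    """Incoherent optical convolution: the scene's intensity pattern is
    convolved with the system's point spread function (valid mode).
    Intensities are nonnegative, as required by incoherent optics."""
    ih, iw = image.shape
    kh, kw = psf.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * psf)
    return out

def saturable_activation(intensity, i_sat=1.0):
    """Hypothetical saturable nonlinearity standing in for the quantum dot
    film: the response grows sublinearly at high intensity, giving the
    two-dimensional, power-free activation the abstract describes."""
    return intensity / (1.0 + intensity / i_sat)

rng = np.random.default_rng(0)
scene = rng.random((8, 8))                  # nonnegative incoherent scene
psf = rng.random((3, 3))
psf /= psf.sum()                            # passive mask: nonnegative, energy-conserving

linear = optical_conv2d(scene, psf)         # linear model output
nonlinear = saturable_activation(linear)    # with passive activation layer
```

In a classification pipeline, `nonlinear` (rather than `linear`) would feed the downstream electronic layers; the paper's reported accuracy gain comes from exactly this added nonlinearity, though its measured film response need not match the saturable form assumed here.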
ISSN: 2327-9125
DOI: 10.1364/PRJ.515349