Programmable Tanh-, ELU-, Sigmoid-, and Sin-based Nonlinear Activation Functions for Neuromorphic Photonics

Detailed Description

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Quantum Electronics, 2023-11, Vol. 29 (6: Photonic Signal Processing), pp. 1-10
Authors: Pappas, C., Kovaios, S., Moralis-Pegios, M., Tsakyridis, A., Giamougiannis, G., Kirtas, M., Van Kerrebrouck, J., Coudyzer, G., Yin, X., Passalis, N., Tefas, A., Pleros, N.
Format: Article
Language: English
Description
Abstract: We demonstrate a programmable analog opto-electronic (OE) circuit that can be configured to provide a range of nonlinear activation functions for incoherent neuromorphic photonic circuits at up to 10 Gbaud line rates. We present a set of well-known activation functions that are typically used to train deep learning (DL) models, including tanh-, sigmoid-, ReLU- and inverted-ReLU-like activations, and introduce a series of novel photonic nonlinear functions referred to as Rectified Sine Squared (ReSin), Sine Squared with Exponential tail (ExpSin), and Double Sine Squared. Experimental validation of all these activation functions is performed at 10 Gbaud operation. The ability of the mathematically modelled photonic activation functions to train Deep Neural Networks (DNNs) has been verified through their employment in DL models for MNIST and CIFAR10 classification, comparing their performance against corresponding NNs that use an ideal ReLU activation function.
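
The record names the novel sine-based activations but does not give their closed forms. As a rough illustration, here is a minimal NumPy sketch of plausible shapes for ReSin, ExpSin, and Double Sine Squared; the helper names (resin, expsin, double_sin2), breakpoints, and piecewise forms are assumptions inferred from the function names alone, not the paper's definitions.

```python
import numpy as np

# Hypothetical sketches of the sine-based activations named in the
# abstract. The paper's exact functional forms, scaling, and operating
# ranges are not stated in this record; the shapes below are inferred
# from the function names alone.

def resin(x):
    """Rectified Sine Squared (ReSin), assumed form: zero for negative
    inputs, a sin^2 ramp on [0, pi/2], saturating at 1 beyond."""
    return np.where(x < 0, 0.0,
                    np.where(x <= np.pi / 2, np.sin(x) ** 2, 1.0))

def expsin(x):
    """Sine Squared with Exponential tail (ExpSin), assumed form: a
    sin^2 rise on [0, pi/2] followed by an exponentially decaying tail,
    continuous at x = pi/2 where both branches equal 1."""
    return np.where(x < 0, 0.0,
                    np.where(x <= np.pi / 2,
                             np.sin(x) ** 2,
                             np.exp(-(x - np.pi / 2))))

def double_sin2(x):
    """Double Sine Squared, assumed form: two sin^2 lobes over
    [0, 2*pi] (sin^2 has period pi, so 'double' is read as two
    periods), zero elsewhere."""
    return np.where((x >= 0) & (x <= 2 * np.pi), np.sin(x) ** 2, 0.0)

if __name__ == "__main__":
    # Quick shape check on a coarse grid.
    x = np.linspace(-1.0, 7.0, 9)
    for name, f in [("ReSin", resin), ("ExpSin", expsin),
                    ("DoubleSin2", double_sin2)]:
        print(f"{name:>10}: {np.round(f(x), 3)}")
```

Each piecewise branch above is chosen so the pieces join continuously at the breakpoints, which keeps the sketch plausible as an analog OE transfer function.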
ISSN: 1077-260X (print), 1558-4542 (electronic)
DOI: 10.1109/JSTQE.2023.3277118