Online training and pruning of photonic neural networks
Saved in:
Main authors: , , , , , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Photonic neural networks (PNNs) have garnered significant interest due to their potential to offer low latency, high bandwidth, and energy efficiency in neuromorphic computing and machine learning. In PNNs, weights are encoded in photonic devices, which makes them susceptible to environmental factors and fabrication variations. These vulnerabilities can result in inaccurate parameter mapping, increased tuning power consumption, and reduced network performance when conventional offline training methods are used. Here, we experimentally demonstrate an online training and pruning method to address these challenges. By incorporating a power-related term into the conventional loss function, our approach minimizes the inference power budget. With this method, PNNs achieve 96% accuracy on the Iris dataset while reducing power consumption by almost 45%, despite fabrication and thermal variations. Furthermore, our method is validated with a two-layer convolutional neural network (CNN) experiment for radio-frequency (RF) fingerprinting applications and with simulations of larger CNNs on image classification datasets, including MNIST, CIFAR-10, and CIFAR-100. This work represents a significant milestone in enabling adaptive online training of PNNs and showcases their potential for real-world applications.
DOI: 10.48550/arxiv.2412.08184
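
The abstract describes, but does not specify, the power-related loss term used for online training and pruning. The sketch below is a rough illustration only: it adds a power penalty to a conventional task loss and prunes weights whose estimated tuning power is negligible. The power model (heater power roughly proportional to phase shift), the constants `p_pi`, `lam`, and `threshold`, and all function names are assumptions made for this example, not details taken from the paper.

```python
import numpy as np

# Illustrative power model (assumption): thermo-optic phase-shifter heater
# power grows roughly linearly with the applied phase shift, reaching p_pi
# watts at a pi phase shift.
def tuning_power(phases, p_pi=25e-3):
    """Estimated per-device tuning power in watts for phase settings (rad)."""
    return p_pi * np.abs(phases) / np.pi

def power_aware_loss(task_loss, phases, lam=1e-2):
    """Conventional task loss plus a power-related penalty term."""
    return task_loss + lam * tuning_power(phases).sum()

def prune_low_power(phases, threshold=1e-4):
    """Online pruning step (hypothetical criterion): zero out devices whose
    estimated tuning power has fallen below a threshold."""
    pruned = phases.copy()
    pruned[tuning_power(pruned) < threshold] = 0.0
    return pruned

# Toy usage: penalize a random phase configuration and prune it.
rng = np.random.default_rng(0)
phases = rng.uniform(0.0, 2 * np.pi, size=16)
print(power_aware_loss(task_loss=0.35, phases=phases))
print(prune_low_power(phases))
```

In an online, hardware-in-the-loop setting, such a penalty would be evaluated together with the task loss at each update; the paper itself should be consulted for the exact form of the term and the pruning schedule.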