Online training and pruning of photonic neural networks

Photonic neural networks (PNNs) have garnered significant interest due to their potential to offer low latency, high bandwidth, and energy efficiency in neuromorphic computing and machine learning. In PNNs, weights are realized by photonic devices, which makes them susceptible to environmental factors and fabrication variations. These vulnerabilities can result in inaccurate parameter mapping, increased tuning power consumption, and reduced network performance when conventional offline training methods are used. Here, we experimentally demonstrate an online training and pruning method that addresses these challenges. By incorporating a power-related term into the conventional loss function, our approach minimizes the inference power budget. With this method, PNNs achieve 96% accuracy on the Iris dataset while reducing power consumption by almost 45%, despite fabrication and thermal variations. Furthermore, the method is validated with a two-layer convolutional neural network (CNN) experiment for radio-frequency (RF) fingerprinting and with simulations of larger CNNs on image classification datasets, including MNIST, CIFAR-10, and CIFAR-100. This work represents a significant milestone in enabling adaptive online training of PNNs and showcases their potential for real-world applications.
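The record describes the method only at a high level: a power-related term is added to the conventional loss so that online training favors low tuning power, and weights driven toward zero can then be pruned. The following is a minimal sketch of that idea in PyTorch, assuming an L1-style penalty as a rough proxy for tuning power; the stand-in model, the coefficient power_weight, and the pruning threshold are illustrative assumptions, not details taken from this record.

import torch
import torch.nn as nn

# Stand-in model: a single linear layer sized for the Iris dataset
# (4 features -> 3 classes); the real system uses photonic weight banks.
model = nn.Linear(4, 3)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
power_weight = 1e-2  # assumed trade-off coefficient between accuracy and power

def training_step(x, y):
    """One online update: task loss plus a power-related penalty."""
    optimizer.zero_grad()
    logits = model(x)
    # L1 penalty on the parameters as a rough proxy for tuning power;
    # it also drives small weights toward zero so they can be pruned.
    power_penalty = sum(p.abs().sum() for p in model.parameters())
    loss = criterion(logits, y) + power_weight * power_penalty
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage with random data shaped like an Iris mini-batch.
x, y = torch.randn(8, 4), torch.randint(0, 3, (8,))
training_step(x, y)

# Pruning: zero out weights whose magnitude fell below a (hypothetical)
# threshold, removing their contribution to the tuning power budget.
with torch.no_grad():
    model.weight[model.weight.abs() < 1e-2] = 0.0

In the experiments summarized above, the same principle is applied to on-chip photonic weight banks rather than a software linear layer, with the penalty coefficient setting the trade-off between classification accuracy and the inference power budget.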

Bibliographic Details

Main Authors: Zhang, Jiawei; Zhang, Weipeng; Xu, Tengji; Lederman, Joshua C; Doris, Eli A; Shastri, Bhavin J; Huang, Chaoran; Prucnal, Paul R
Format: Article
Language: English
Published: 2024-12-11
Subjects: Physics - Optics
Source: arXiv.org
DOI: 10.48550/arxiv.2412.08184
Online Access: Full text at https://arxiv.org/abs/2412.08184