DIET-SNN: A Low-Latency Spiking Neural Network With Direct Input Encoding and Leakage and Threshold Optimization


Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2023-06, Vol. 34 (6), p. 3174-3182
Main Authors: Rathi, Nitin; Roy, Kaushik
Format: Article
Language: English
Online Access: Full text
Description: Bioinspired spiking neural networks (SNNs), operating with asynchronous binary signals (or spikes) distributed over time, can potentially lead to greater computational efficiency on event-driven hardware. State-of-the-art SNNs suffer from high inference latency, resulting from inefficient input encoding and suboptimal settings of the neuron parameters (firing threshold and membrane leak). We propose DIET-SNN, a low-latency deep spiking network trained with gradient descent to optimize the membrane leak and the firing threshold along with the other network parameters (weights). The membrane leak and threshold of each layer are optimized with end-to-end backpropagation to achieve competitive accuracy at reduced latency. The input layer directly processes the analog pixel values of an image without converting them to a spike train. The first convolutional layer converts the analog inputs into spikes: leaky-integrate-and-fire (LIF) neurons integrate the weighted inputs and generate an output spike when the membrane potential crosses the trained firing threshold. The trained membrane leak selectively attenuates the membrane potential, which increases activation sparsity in the network. The reduced latency combined with high activation sparsity yields large improvements in computational efficiency. We evaluate DIET-SNN on image classification tasks from the CIFAR and ImageNet datasets on VGG and ResNet architectures. We achieve 69% top-1 accuracy with five timesteps (inference latency) on ImageNet with [Formula Omitted] less compute energy than an equivalent standard artificial neural network (ANN). In addition, DIET-SNN performs 20–[Formula Omitted] faster inference than other state-of-the-art SNN models.
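The integrate-leak-fire-reset loop described in the abstract can be sketched in a few lines of NumPy. The weights, leak factor, threshold, and layer sizes below are illustrative stand-ins, not the per-layer values that DIET-SNN learns with backpropagation; the same analog input is presented at every timestep, mirroring the direct input encoding the paper describes.

```python
import numpy as np

def lif_forward(x, w, leak=0.9, threshold=1.0, timesteps=5):
    """Simulate one leaky-integrate-and-fire (LIF) layer with direct encoding.

    x: analog input vector (e.g., pixel values), shape (n_in,)
    w: weight matrix, shape (n_out, n_in)
    leak, threshold: fixed here for illustration; trained per layer in DIET-SNN.
    Returns a (timesteps, n_out) binary spike raster.
    """
    n_out = w.shape[0]
    v = np.zeros(n_out)                 # membrane potential
    spikes = np.zeros((timesteps, n_out))
    for t in range(timesteps):
        v = leak * v + w @ x            # leak the potential, integrate weighted input
        fired = v >= threshold          # spike when potential crosses the threshold
        spikes[t] = fired.astype(float)
        v = v - fired * threshold       # soft reset: subtract the threshold
    return spikes

# A strongly driven neuron fires every step; a weakly driven one fires rarely,
# which is the activation sparsity the leak and threshold control.
raster = lif_forward(np.array([0.5, 0.5]),
                     np.array([[1.0, 1.0],    # strong input current (1.0/step)
                               [0.3, 0.3]]))  # weak input current (0.3/step)
```

The soft reset (subtracting the threshold instead of zeroing the potential) is one common choice for low-timestep SNNs, since it preserves residual potential across steps; swapping in a hard reset (`v[fired] = 0`) is a one-line change.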
DOI: 10.1109/TNNLS.2021.3111897
ISSN: 2162-237X
EISSN: 2162-2388
Source: IEEE Electronic Library (IEL)
Subjects:
Accuracy
Artificial neural networks
Back propagation networks
Computational efficiency
Computational neuroscience
Computing time
Datasets
Diet
Firing pattern
Image classification
Inference
Latency
Membrane potential
Membranes
Network latency
Neural networks
Nutrient deficiency
Optimization
Parameters
Sparsity
Spiking