Back propagation simulations using limited precision calculations

The precision required for neural net algorithms is an important question facing hardware architects. The authors present simulation results that compare floating point and limited precision integer back-propagation simulators. Data sets from the neural network benchmark suite maintained by Carnegie Mellon University were used to compare integer and floating point implementations. The simulation results indicate that integer computation works quite well for the back-propagation algorithm. In all cases except one, the limited precision integer simulations performed as well as the floating point simulations. The effect of reducing the precision of the trained weights is also reported.
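The comparison described in the abstract turns on fixed-point arithmetic: weights and activations are stored as scaled integers, products are accumulated in a wider word, and results are rescaled and clipped back to the limited range. The sketch below illustrates that kind of computation against a floating point reference; it is not the authors' simulator, and the bit widths, scale factor, and helper names are assumptions chosen for illustration.

```python
import numpy as np

# Fixed-point parameters -- illustrative assumptions, not the paper's values.
WEIGHT_BITS = 16      # assumed integer word length
FRAC_BITS = 12        # assumed fractional bits, i.e. scale factor 2**12
SCALE = 1 << FRAC_BITS

def quantize(x, bits=WEIGHT_BITS):
    """Round floats to fixed-point integers and clip to the representable range."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return np.clip(np.rint(x * SCALE), lo, hi).astype(np.int64)

def int_matvec(w_q, x_q):
    """Integer multiply-accumulate, then rescale the result back to FRAC_BITS."""
    acc = w_q @ x_q                        # wide accumulator holds scale**2 products
    return quantize(acc / (SCALE * SCALE)) # renormalize to a single scale factor

rng = np.random.default_rng(0)
w = rng.uniform(-1.0, 1.0, (4, 8))   # toy weight matrix
x = rng.uniform(-1.0, 1.0, 8)        # toy input vector

float_out = w @ x                                         # floating point reference
int_out = int_matvec(quantize(w), quantize(x)) / SCALE    # limited precision result
print(np.abs(float_out - int_out).max())                  # small quantization error
```

Lowering WEIGHT_BITS or FRAC_BITS in this sketch mimics the reduction in trained-weight precision the abstract mentions, at the cost of a larger quantization error.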

Bibliographic Details

Main Authors: Holt, J.L.; Baker, T.E.
Format: Conference Proceeding
Language: English
Published in: IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991, Vol. II, pp. 121-126
Publisher: IEEE
DOI: 10.1109/IJCNN.1991.155324
ISBN: 0780301641; 9780780301641
Subjects: Algorithm design and analysis; Artificial neural networks; Backpropagation algorithms; Computational modeling; Computer networks; Convergence; Neural network hardware; Neural networks; Testing