Bayesian decision theory on three-layer neural networks

We discuss the Bayesian decision theory on neural networks. In the two-category case where the state-conditional probabilities are normal, a three-layer neural network having d hidden layer units can approximate the posterior probability in L^p(R^d, p), where d is the dimension of the space of observables. We extend this result to multicategory cases. Then, the number of hidden layer units must be increased, but can be bounded by d(d+1)/2 irrespective of the number of categories if the neural network has direct connections between the input and output layers. In the case where the state-conditional probability is one of the familiar probability distributions, such as the binomial, multinomial, Poisson, and negative binomial distributions, a two-layer neural network can approximate the posterior probability.
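The d(d+1)/2 bound can be understood through the two-category Gaussian case: the Bayes posterior is a logistic sigmoid applied to a quadratic form in x, whose d(d+1)/2 distinct second-order coefficients correspond to the hidden units, while the linear part can be carried by direct input-output connections. The following is a minimal numerical sketch (our illustration, not code from the paper) verifying this identity:

```python
import numpy as np

def gauss_logpdf(x, mean, cov):
    # Log-density of a multivariate normal, computed directly.
    d = len(mean)
    diff = x - mean
    return -0.5 * (d * np.log(2 * np.pi) + np.log(np.linalg.det(cov))
                   + diff @ np.linalg.solve(cov, diff))

def posterior_class1(x, m0, S0, m1, S1, prior1=0.5):
    # Bayes rule: P(1|x) = sigmoid(log-likelihood ratio + log-prior ratio).
    llr = (gauss_logpdf(x, m1, S1) - gauss_logpdf(x, m0, S0)
           + np.log(prior1 / (1 - prior1)))
    return 1.0 / (1.0 + np.exp(-llr))

def quadratic_params(m0, S0, m1, S1, prior1=0.5):
    # Expand the log-likelihood ratio as x^T A x + b^T x + c.
    P0, P1 = np.linalg.inv(S0), np.linalg.inv(S1)
    A = 0.5 * (P0 - P1)        # symmetric: d(d+1)/2 independent entries
    b = P1 @ m1 - P0 @ m0      # linear part: direct input-output connections
    c = (0.5 * (m0 @ P0 @ m0 - m1 @ P1 @ m1)
         + 0.5 * np.log(np.linalg.det(S0) / np.linalg.det(S1))
         + np.log(prior1 / (1 - prior1)))
    return A, b, c

# Hypothetical example parameters (d = 2):
m0, S0 = np.array([0.0, 0.0]), np.array([[1.0, 0.2], [0.2, 1.0]])
m1, S1 = np.array([1.0, 1.0]), np.array([[0.5, 0.0], [0.0, 2.0]])
x = np.array([0.3, -0.7])

A, b, c = quadratic_params(m0, S0, m1, S1)
direct = 1.0 / (1.0 + np.exp(-(x @ A @ x + b @ x + c)))
assert np.isclose(posterior_class1(x, m0, S0, m1, S1), direct)
```

In the exponential-family cases mentioned in the abstract (binomial, Poisson, etc.), the log-likelihood ratio is linear in the sufficient statistic, so the sigmoid of a linear function, i.e. a two-layer network with no hidden layer, already suffices.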

Detailed description

Bibliographic details
Published in: Neurocomputing (Amsterdam), 2005, Vol.63, p.209-228
Main authors: Ito, Yoshifusa; Srinivasan, Cidambi
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 228
container_issue
container_start_page 209
container_title Neurocomputing (Amsterdam)
container_volume 63
creator Ito, Yoshifusa
Srinivasan, Cidambi
description We discuss the Bayesian decision theory on neural networks. In the two-category case where the state-conditional probabilities are normal, a three-layer neural network having d hidden layer units can approximate the posterior probability in L^p(R^d, p), where d is the dimension of the space of observables. We extend this result to multicategory cases. Then, the number of the hidden layer units must be increased, but can be bounded by d(d+1)/2 irrespective of the number of categories if the neural network has direct connections between the input and output layers. In the case where the state-conditional probability is one of familiar probability distributions such as binomial, multinomial, Poisson, negative binomial distributions and so on, a two-layer neural network can approximate the posterior probability.
doi_str_mv 10.1016/j.neucom.2004.05.005
format Article
fulltext fulltext
identifier ISSN: 0925-2312
ispartof Neurocomputing (Amsterdam), 2005, Vol.63, p.209-228
issn 0925-2312
1872-8286
language eng
recordid cdi_proquest_miscellaneous_17317669
source Access via ScienceDirect (Elsevier)
subjects Approximation
Bayesian decision
Direct connection
Layered neural network
Logistic transform
title Bayesian decision theory on three-layer neural networks
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-18T14%3A01%3A28IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Bayesian%20decision%20theory%20on%20three-layer%20neural%20networks&rft.jtitle=Neurocomputing%20(Amsterdam)&rft.au=Ito,%20Yoshifusa&rft.date=2005&rft.volume=63&rft.spage=209&rft.epage=228&rft.pages=209-228&rft.issn=0925-2312&rft.eissn=1872-8286&rft_id=info:doi/10.1016/j.neucom.2004.05.005&rft_dat=%3Cproquest_cross%3E17317669%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=17317669&rft_id=info:pmid/&rft_els_id=S0925231204003236&rfr_iscdi=true