Deep Gaussian processes and infinite neural networks for the analysis of EEG signals in Alzheimer’s diseases

Deep Gaussian process (DGP) models can be represented hierarchically by a sequential composition of layers. When the prior distributions over the weights and biases of a neural network are independent and identically distributed, there is an equivalence with Gaussian processes (GPs) in the limit of infinite network width. DGPs are non-parametric statistical models used to characterize patterns of complex non-linear systems due to their flexibility, greater generalization capacity, and a natural way of making inferences about the parameters and states of the system. This article proposes a hierarchical Bayesian structure to model the weights and biases of a deep neural network. We deduce a general formula to calculate the integrals of Gaussian processes with non-linear transfer densities and obtain a kernel to estimate the covariance functions. In the methodology, we conduct an empirical study analyzing an electroencephalogram (EEG) database for diagnosing Alzheimer’s disease. Additionally, the DGP models are estimated and compared with NN models with 5, 10, 50, 100, 500, and 1000 neurons in the hidden layer, considering two transfer functions: Rectified Linear Unit (ReLU) and hyperbolic tangent (Tanh). The results show good performance in the classification of the signals. Finally, we use the mean squared error as a goodness-of-fit measure to validate the proposed models, obtaining low estimation errors.
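
The infinite-width equivalence summarized above has well-known closed-form covariances for a single hidden layer. The sketch below is not the authors' implementation; it is a minimal illustration, assuming the standard arc-cosine kernel for ReLU units (Cho and Saul, 2009) and the erf kernel (Williams, 1997) as a closed-form stand-in for a Tanh-like transfer function. The function names and the prior-variance hyperparameters sigma_w2 and sigma_b2 are assumptions, not notation from the paper.

```python
import numpy as np

def relu_nngp_kernel(X1, X2, sigma_w2=1.0, sigma_b2=0.1):
    """Covariance of a one-hidden-layer ReLU network in the
    infinite-width limit (arc-cosine kernel of order 1)."""
    d = X1.shape[1]
    # Input-layer (pre-activation) kernel: k0(x, x') = sigma_b2 + sigma_w2 <x, x'> / d
    k12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d
    k11 = sigma_b2 + sigma_w2 * np.sum(X1 * X1, axis=1) / d
    k22 = sigma_b2 + sigma_w2 * np.sum(X2 * X2, axis=1) / d
    norm = np.sqrt(np.outer(k11, k22))
    theta = np.arccos(np.clip(k12 / norm, -1.0, 1.0))
    # E[relu(u) relu(v)] for (u, v) jointly Gaussian under k0
    e_relu = norm * (np.sin(theta) + (np.pi - theta) * np.cos(theta)) / (2.0 * np.pi)
    return sigma_b2 + sigma_w2 * e_relu

def erf_nngp_kernel(X1, X2, sigma_w2=1.0, sigma_b2=0.1):
    """Covariance of a one-hidden-layer erf network in the
    infinite-width limit; a closed-form surrogate for Tanh units."""
    d = X1.shape[1]
    k12 = sigma_b2 + sigma_w2 * (X1 @ X2.T) / d
    k11 = sigma_b2 + sigma_w2 * np.sum(X1 * X1, axis=1) / d
    k22 = sigma_b2 + sigma_w2 * np.sum(X2 * X2, axis=1) / d
    denom = np.sqrt(np.outer(1.0 + 2.0 * k11, 1.0 + 2.0 * k22))
    # E[erf(u) erf(v)] for (u, v) jointly Gaussian under k0
    e_erf = (2.0 / np.pi) * np.arcsin(np.clip(2.0 * k12 / denom, -1.0, 1.0))
    return sigma_b2 + sigma_w2 * e_erf
```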

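As a rough illustration of the validation step described in the abstract (fitting GP models with such kernels and scoring them with the mean squared error), a minimal sketch follows, reusing the two kernel functions defined above. The synthetic feature vectors standing in for EEG features, the noise level, and the 0/1 coding of the diagnostic labels are assumptions and do not reproduce the paper's dataset, preprocessing, or evaluation protocol.

```python
import numpy as np

def gp_posterior_mean(K_train, K_cross, y_train, noise=0.1):
    """GP regression posterior mean: K_* (K + noise * I)^{-1} y."""
    n = K_train.shape[0]
    L = np.linalg.cholesky(K_train + noise * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    return K_cross @ alpha

# Hypothetical stand-in data: 16-dimensional feature vectors with binary
# labels (0 = control, 1 = Alzheimer's), NOT the EEG database from the paper.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 16))
y = (X[:, :4].sum(axis=1) > 0).astype(float)
Xtr, Xte, ytr, yte = X[:80], X[80:], y[:80], y[80:]

for name, kern in [("ReLU", relu_nngp_kernel), ("erf (Tanh-like)", erf_nngp_kernel)]:
    mean = gp_posterior_mean(kern(Xtr, Xtr), kern(Xte, Xtr), ytr)
    mse = np.mean((mean - yte) ** 2)            # goodness-of-fit measure used in the paper
    acc = np.mean((mean > 0.5) == (yte > 0.5))
    print(f"{name:15s}  MSE = {mse:.3f}  accuracy = {acc:.2f}")
```
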
Bibliographic details
Published in: Revista de Matemática Teoría y Aplicaciones, 2022-12, Vol. 29 (2), pp. 289-312
Main authors: Román, Krishna; Cumbicus, Andy; Infante, Saba; Fonseca-Delgado, Rigoberto
Format: Article
Language: English; Portuguese
Subjects: Alzheimer disease; deep Gaussian process; electroencephalogram; Mathematics; Mathematics, Applied
DOI: 10.15517/rmta.v29i2.48885
ISSN: 1409-2433; EISSN: 2215-3373
Publisher: Centro de Investigaciones en Matemática Pura y Aplicada (CIMPA) y Escuela de Matemática, San José, Costa Rica
Online access: Full text