Self-Organizing MultiLayer Perceptron
In this paper, we propose an extension of a self-organizing map called self-organizing multilayer perceptron (SOMLP) whose purpose is to achieve quantization of spaces of functions. Based on the use of multilayer perceptron networks, SOMLP comprises unsupervised as well as supervised learning algorithms. We demonstrate that it is possible to use the commonly used vector quantization algorithms (LVQ algorithms) to build new algorithms called functional quantization algorithms (LFQ algorithms). The SOMLP can be used to model nonlinear and/or nonstationary complex dynamic processes, such as speech signals. While most functional data analysis (FDA) research is based on B-splines or similar univariate functions, the SOMLP algorithm allows quantization of functions with a high-dimensional input space. As a consequence, classical FDA methods can be outperformed by increasing the dimensionality of the input space of the functions under analysis. Experiments on artificial and real-world examples are presented which illustrate the potential of this approach.
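The record contains no code, but the mechanism the abstract describes — a codebook whose entries are multilayer perceptron networks, trained by an LVQ-style competition so that each prototype network ends up modelling one region of a space of functions — is concrete enough to sketch. The Python below is only an illustrative toy under stated assumptions, not the paper's SOMLP/LFQ algorithm: the names (`MLPPrototype`, `lfq_fit`), the hard winner-take-all update, and every hyperparameter are invented here, and the supervised and neighborhood-based variants the abstract mentions are left out.

```python
# Illustrative toy of functional quantization with MLP prototypes (assumed names,
# not the paper's implementation). Each observed "function" is a batch of (x, y)
# samples; the winning prototype is the MLP that predicts y from x with the
# lowest mean squared error, and only the winner is pulled toward that function.
import numpy as np

rng = np.random.default_rng(0)


class MLPPrototype:
    """One codebook entry: a one-hidden-layer MLP instead of a prototype vector."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.H = np.tanh(X @ self.W1 + self.b1)  # cache hidden activations
        return self.H @ self.W2 + self.b2

    def distortion(self, X, Y):
        """Prediction error plays the role of the distance used in the competition."""
        return float(np.mean((self.forward(X) - Y) ** 2))

    def update(self, X, Y, lr):
        """One backpropagation step moving this prototype toward the function (X, Y)."""
        Y_hat = self.forward(X)
        n = X.shape[0]
        d_out = 2.0 * (Y_hat - Y) / n                       # dMSE / dY_hat
        d_hid = (d_out @ self.W2.T) * (1.0 - self.H ** 2)   # tanh derivative
        self.W2 -= lr * (self.H.T @ d_out)
        self.b2 -= lr * d_out.sum(axis=0)
        self.W1 -= lr * (X.T @ d_hid)
        self.b1 -= lr * d_hid.sum(axis=0)


def lfq_fit(functions, n_prototypes=2, n_hidden=8, n_epochs=200, lr=0.05):
    """Unsupervised, hard winner-take-all functional quantization (toy flavour)."""
    n_in = functions[0][0].shape[1]
    n_out = functions[0][1].shape[1]
    protos = [MLPPrototype(n_in, n_hidden, n_out) for _ in range(n_prototypes)]
    for _ in range(n_epochs):
        for X, Y in functions:
            winner = min(protos, key=lambda p: p.distortion(X, Y))
            winner.update(X, Y, lr)
    return protos


# Toy usage: four 1-D functions from two frequency families, each observed
# through 64 noisy samples, quantized with two MLP prototypes.
functions = []
for freq in (0.8, 1.0, 2.9, 3.1):
    X = rng.uniform(-np.pi, np.pi, (64, 1))
    Y = np.sin(freq * X) + 0.05 * rng.normal(size=(64, 1))
    functions.append((X, Y))

protos = lfq_fit(functions)
for i, (X, Y) in enumerate(functions):
    best = min(range(len(protos)), key=lambda k: protos[k].distortion(X, Y))
    print(f"function {i} -> prototype {best}")
```

The design point the sketch tries to capture is that the competition uses a prediction error rather than a Euclidean distance between vectors: each codebook entry is a function approximator, which is what makes this quantization of functions rather than quantization of points.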
Published in: | IEEE Transactions on Neural Networks, 2010-11, Vol.21 (11), p.1766-1779 |
---|---|
Main Author: | Gas, Bruno |
Format: | Article |
Language: | eng |
Subjects: | Functional data analysis; multilayer perceptron; multivariate functions quantization; self-organizing feature maps; speech processing |
Online Access: | Order full text |
container_end_page | 1779 |
---|---|
container_issue | 11 |
container_start_page | 1766 |
container_title | IEEE Transactions on Neural Networks |
container_volume | 21 |
creator | Gas, Bruno |
description | In this paper, we propose an extension of a self-organizing map called self-organizing multilayer perceptron (SOMLP) whose purpose is to achieve quantization of spaces of functions. Based on the use of multilayer perceptron networks, SOMLP comprises unsupervised as well as supervised learning algorithms. We demonstrate that it is possible to use the commonly used vector quantization algorithms (LVQ algorithms) to build new algorithms called functional quantization algorithms (LFQ algorithms). The SOMLP can be used to model nonlinear and/or nonstationary complex dynamic processes, such as speech signals. While most functional data analysis (FDA) research is based on B-splines or similar univariate functions, the SOMLP algorithm allows quantization of functions with a high-dimensional input space. As a consequence, classical FDA methods can be outperformed by increasing the dimensionality of the input space of the functions under analysis. Experiments on artificial and real-world examples are presented which illustrate the potential of this approach. |
doi_str_mv | 10.1109/TNN.2010.2072790 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1045-9227; EISSN: 1941-0093; DOI: 10.1109/TNN.2010.2072790; PMID: 20858579; CODEN: ITNNEP |
ispartof | IEEE Transactions on Neural Networks, 2010-11, Vol.21 (11), p.1766-1779 |
issn | 1045-9227 2162-237X 1941-0093 2162-2388 |
language | eng |
recordid | cdi_proquest_miscellaneous_762482613 |
source | IEEE Electronic Library (IEL) |
subjects | Adaptation model; Algorithm design and analysis; Algorithms; Applied sciences; Approximation methods; Artificial Intelligence; Computer Science; Computer science; control theory; systems; Connectionism. Neural networks; Construction; Data processing; Engineering Sciences; Exact sciences and technology; Functional data analysis; Heuristic algorithms; Machine Learning; Mathematical Computing; Mathematical models; multilayer perceptron; Multilayer perceptrons; multivariate functions quantization; Neural and Evolutionary Computing; Neural networks; Neural Networks (Computer); Neurons; Nonlinear Dynamics; Quantization; Regression analysis; Robotics; self-organizing feature maps; Signal and Image processing; Signal Processing, Computer-Assisted; Speech and sound recognition and synthesis. Linguistics; speech processing; Speech Recognition Software - standards; Studies; Vector quantization |
title | Self-Organizing MultiLayer Perceptron |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-28T06%3A36%3A04IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Self-Organizing%20MultiLayer%20Perceptron&rft.jtitle=IEEE%20transaction%20on%20neural%20networks%20and%20learning%20systems&rft.au=Gas,%20Bruno&rft.date=2010-11-01&rft.volume=21&rft.issue=11&rft.spage=1766&rft.epage=1779&rft.pages=1766-1779&rft.issn=1045-9227&rft.eissn=1941-0093&rft.coden=ITNNEP&rft_id=info:doi/10.1109/TNN.2010.2072790&rft_dat=%3Cproquest_RIE%3E849469398%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1030148452&rft_id=info:pmid/20858579&rft_ieee_id=5580080&rfr_iscdi=true |