A Fast Incremental Gaussian Mixture Model
This work builds upon previous efforts in online incremental learning, namely the Incremental Gaussian Mixture Network (IGMN). The IGMN is capable of learning from data streams in a single pass, improving its model after analyzing each data point and discarding it thereafter. Nevertheless, it suffers from a scalability standpoint: its asymptotic time complexity of O(NKD³) for N data points, K Gaussian components, and D dimensions renders it inadequate for high-dimensional data. In this work, we reduce this complexity to O(NKD²) by deriving formulas for working directly with precision matrices instead of covariance matrices. The final result is a much faster and more scalable algorithm that can be applied to high-dimensional tasks. This is confirmed by applying the modified algorithm to high-dimensional classification datasets.
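The complexity drop rests on a standard identity: when a covariance matrix receives a rank-one update, its inverse (the precision matrix) and its determinant can be updated in O(D²) via the Sherman–Morrison formula and the matrix determinant lemma, avoiding the O(D³) cost of refactoring the covariance. The sketch below illustrates that identity on a generic update of the form C_new = (1 − ω)C + ω·d·dᵀ; the function names are illustrative, and the paper's actual IGMN update (which also shifts the component mean) is not reproduced here.

```python
import numpy as np

def rank_one_precision_update(P, logdet_C, d, omega):
    """Given P = C^{-1} and log|C|, return the precision and log-determinant
    of C_new = (1 - omega) * C + omega * d d^T in O(D^2), using
    Sherman-Morrison and the matrix determinant lemma."""
    D = P.shape[0]
    s = omega / (1.0 - omega)       # C_new = (1 - omega) * (C + s d d^T)
    Pd = P @ d                      # O(D^2) matrix-vector product
    denom = 1.0 + s * (d @ Pd)
    # Sherman-Morrison: (C + s d d^T)^{-1} = P - s (Pd)(Pd)^T / denom
    P_new = (P - s * np.outer(Pd, Pd) / denom) / (1.0 - omega)
    # Determinant lemma: |C_new| = (1 - omega)^D * |C| * denom
    logdet_new = D * np.log(1.0 - omega) + logdet_C + np.log(denom)
    return P_new, logdet_new

def log_gaussian(x, mu, P, logdet_C):
    """log N(x; mu, C) evaluated from the precision matrix -- no inversion."""
    d = x - mu
    D = d.size
    return -0.5 * (D * np.log(2 * np.pi) + logdet_C + d @ P @ d)
```

With both the precision matrix and the log-determinant maintained incrementally, each Gaussian component's likelihood evaluation and update cost O(D²) per data point, which is the source of the O(NKD²) total claimed in the abstract.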
Published in: | PloS one 2015-10, Vol.10 (10), p.e0139931-e0139931 |
---|---|
Main authors: | Pinto, Rafael Coimbra; Engel, Paulo Martins |
Format: | Article |
Language: | English |
Online access: | Full text |
creator | Pinto, Rafael Coimbra; Engel, Paulo Martins |
---|---|
doi | 10.1371/journal.pone.0139931 |
identifier | ISSN: 1932-6203; EISSN: 1932-6203; PMID: 26444880 |
publisher | Public Library of Science, United States |
date | 2015-10-07 |
rights | © 2015 Pinto, Engel; open access under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/) |
subjects | Algorithms; Analysis; Artificial intelligence; Complexity; Covariance matrix; Data points; Data processing; Data transmission; Distance learning; Gaussian processes; Learning; Machine Learning; Models, Theoretical; Neural networks; Normal Distribution |