Introduction to manifold learning

A popular research area today in statistics and machine learning is that of manifold learning, which is related to the algorithmic techniques of dimensionality reduction. Manifold learning can be divided into linear and nonlinear methods. Linear methods, which have long been part of the statistician's toolbox for analyzing multivariate data, include principal component analysis (PCA) and multidimensional scaling (MDS). Recently, there has been a flurry of research activity on nonlinear manifold learning, which includes Isomap, local linear embedding, Laplacian eigenmaps, Hessian eigenmaps, and diffusion maps. Some of these techniques are nonlinear generalizations of the linear methods. The algorithmic process of most of these techniques consists of three steps: a nearest-neighbor search, a definition of distances or affinities between points (a key ingredient for the success of these methods), and an eigenproblem for embedding high-dimensional points into a lower dimensional space. This article gives a brief survey of these new methods and indicates their strengths and weaknesses. WIREs Comput Stat 2012, doi: 10.1002/wics.1222. This article is categorized under: Statistical and Graphical Methods of Data Analysis > Dimension Reduction; Statistical Learning and Exploratory Methods of the Data Sciences > Manifold Learning; Statistical and Graphical Methods of Data Analysis > Multivariate Analysis.
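The three-step recipe the abstract describes (nearest-neighbor search, affinity definition, eigenproblem) can be illustrated with one of the nonlinear methods the article surveys, Laplacian eigenmaps. The sketch below is a minimal NumPy illustration, not code from the article; the function name and parameters are chosen for this example.

```python
import numpy as np

def laplacian_eigenmap(X, n_neighbors=10, n_components=2):
    """Embed the rows of X into n_components dimensions via Laplacian eigenmaps.

    Follows the three generic steps described in the abstract:
    (1) nearest-neighbor search, (2) affinities between points,
    (3) an eigenproblem that yields the low-dimensional embedding.
    """
    n = X.shape[0]
    # Step 1: pairwise squared distances, then the k nearest neighbors
    # of each point (index 0 of the argsort is the point itself).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    knn = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]
    # Step 2: symmetric 0/1 affinity matrix from the kNN graph.
    W = np.zeros((n, n))
    rows = np.repeat(np.arange(n), n_neighbors)
    W[rows, knn.ravel()] = 1.0
    W = np.maximum(W, W.T)
    # Step 3: eigenproblem on the graph Laplacian. We solve the
    # generalized problem L y = lambda D y via the symmetric
    # normalized Laplacian I - D^{-1/2} W D^{-1/2}.
    deg = W.sum(axis=1)
    inv_sqrt = 1.0 / np.sqrt(deg)
    L_sym = np.eye(n) - (W * inv_sqrt[:, None]) * inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L_sym)
    # Skip the trivial constant eigenvector (eigenvalue ~0) and map
    # the normalized eigenvectors back to the generalized problem.
    return vecs[:, 1:n_components + 1] * inv_sqrt[:, None]
```

For example, applying this to points sampled near a circle in three dimensions recovers a two-dimensional embedding that preserves the circular neighborhood structure. A production implementation would use a sparse eigensolver and a heat-kernel affinity instead of the 0/1 weights used here for brevity.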

Detailed description

Bibliographic details
Published in: Wiley interdisciplinary reviews. Computational statistics, 2012-09, Vol. 4 (5), p. 439-446
Author: Izenman, Alan Julian
Format: Article
Language: English
Online access: Full text
DOI: 10.1002/wics.1222
ISSN: 1939-5108
EISSN: 1939-0068
Source: Wiley Online Library All Journals
Subjects: Data; Data analysis; Data processing; diffusion maps; dimensionality reduction; Dye dispersion; Embedding; Graphical methods; Isomap; Laplacian eigenmaps; Learning algorithms; local linear embedding; Machine learning; Manifolds; Manifolds (mathematics); Multidimensional scaling; Multivariate analysis; Principal components analysis; Reduction; Scaling; Statistical analysis; Statistical methods; Statistics; Surveying