Survey: Geometric Foundations of Data Reduction


Bibliographic Details
Published in: arXiv.org, 2022-03
Author: Ju, Ce
Format: Article
Language: English
Subjects:
Online access: Full text
description This survey was written in the summer of 2016. Its purpose is to briefly introduce nonlinear dimensionality reduction (NLDR) in data reduction. The first two NLDR methods were both published in Science in 2000, where they solve the similar problem of reducing high-dimensional data endowed with an intrinsic nonlinear structure. Computer scientists and theoretical physicists typically interpret this intrinsic nonlinear structure through the concept of a manifold from geometry and topology. In 2001, the concept of manifold learning first appeared in an NLDR method called Laplacian Eigenmaps. In a typical manifold learning setup, the data set, also called the observation set, is distributed on or near a low-dimensional manifold M embedded in R^D, so that each observation has a D-dimensional representation. The goal of manifold learning is to reduce these observations to a compact lower-dimensional representation based on this geometric information; the reduction procedure is called spectral manifold learning. In this paper, we derive each spectral manifold learning method in both matrix and operator representations, and then discuss the convergence behavior of each method in a uniform geometric language. Hence, the survey is named Geometric Foundations of Data Reduction.
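As a concrete illustration of the spectral recipe the abstract describes — neighborhood graph, graph Laplacian, eigendecomposition — here is a minimal NumPy/SciPy sketch of Laplacian Eigenmaps. The function name and parameter choices are our own illustration, not taken from the survey:

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, n_neighbors=10, t=1.0):
    """Minimal Laplacian Eigenmaps sketch: embed rows of X into n_components dims."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances between observations.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(d2, np.inf)  # exclude self-neighbors
    # k-nearest-neighbor graph with heat-kernel weights exp(-||xi - xj||^2 / t).
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[:n_neighbors]
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)           # symmetrize the adjacency
    D = np.diag(W.sum(axis=1))       # degree matrix
    L = D - W                        # unnormalized graph Laplacian
    # Generalized eigenproblem L f = lambda D f; drop the constant eigenvector.
    vals, vecs = eigh(L, D)
    return vecs[:, 1:n_components + 1]
```

The embedding coordinates are the eigenvectors of the generalized problem L f = λ D f with the smallest nonzero eigenvalues, which is the matrix representation the survey derives for this method.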
publisher Ithaca: Cornell University Library, arXiv.org
date 2022-03-20
identifier EISSN: 2331-8422
source Free E-Journals
subjects Data reduction
Foundations
Machine learning
Manifolds (mathematics)
Mathematical analysis
Matrix methods
Operators (mathematics)
Physicists
Representations
Spectra
Teaching methods
Topology