Discriminative geodesic Gaussian process latent variable model for structure preserving dimension reduction in clustering and classification problems


Bibliographic Details
Published in: Neural computing & applications, 2019-08, Vol. 31 (8), p. 3265-3278
Main Authors: Heidari, Mahdi; Moattar, Mohammad Hossein
Format: Article
Language: English
Subjects:
Online Access: Full text
description Dimension reduction is a common approach for analyzing complex high-dimensional data and enables efficient implementation of classification and decision algorithms. The Gaussian process latent variable model (GPLVM) is a widely applicable dimension reduction method, but it learns the latent space without considering class labels. Preserving the structure and topology of the data is a key factor influencing the performance of dimensionality reduction models, and a conventional measure that reflects the topological structure of data points is the geodesic distance. In this study, we propose an enriched GPLVM mapping between the low-dimensional space and the high-dimensional data. One contribution of the proposed approach is to calculate geodesic distance under the influence of class labels and to introduce an improved GPLVM kernel based on this distance. In addition, the objective function of the model is reformulated to consider the trade-off between class separation and structure preservation, which improves the discrimination power and compactness of the data. The efficiency of the proposed approach is compared with other dimension reduction techniques such as kernel principal component analysis (KPCA), locally linear embedding (LLE), and Laplacian eigenmaps, as well as discriminative and supervised extensions of the standard GPLVM. The experiments suggest that the proposed model classifies and clusters data more accurately than the aforementioned approaches.
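To make the abstract's two key ingredients concrete, the following is a minimal sketch of (a) geodesic distance approximated by shortest paths on a k-nearest-neighbour graph (the standard Isomap-style construction) and (b) an RBF-style kernel built on that distance instead of the Euclidean one. The `inter_class_penalty` factor, which lengthens edges crossing class boundaries, is an illustrative way to let labels influence the geodesic; it is an assumption for demonstration, not the paper's exact formulation.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

def geodesic_distances(X, y=None, k=5, inter_class_penalty=1.0):
    """Approximate geodesic distances via shortest paths on a k-NN graph.
    If labels y are given, edges that cross class boundaries are stretched
    by inter_class_penalty (hypothetical supervised variant)."""
    n = X.shape[0]
    D = squareform(pdist(X))                    # pairwise Euclidean distances
    idx = np.argsort(D, axis=1)[:, 1:k + 1]     # k nearest neighbours of each point
    W = np.zeros_like(D)
    rows = np.repeat(np.arange(n), k)
    W[rows, idx.ravel()] = D[rows, idx.ravel()] # keep only k-NN edges
    W = np.maximum(W, W.T)                      # symmetrise the graph
    if y is not None:
        cross = y[:, None] != y[None, :]        # mask of inter-class edges
        W[cross] *= inter_class_penalty
    # Dijkstra over the sparse graph; zeros are treated as missing edges
    return shortest_path(csr_matrix(W), method="D", directed=False)

def geodesic_rbf_kernel(G, lengthscale=1.0, variance=1.0):
    """RBF-style kernel on geodesic rather than Euclidean distance."""
    return variance * np.exp(-0.5 * (G / lengthscale) ** 2)

# Toy data: 40 points on a noisy circle, split into two semicircle classes.
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.standard_normal((40, 2))
y = (theta > np.pi).astype(int)
G = geodesic_distances(X, y, k=3, inter_class_penalty=2.0)
K = geodesic_rbf_kernel(G, lengthscale=2.0)
```

Note one design caveat: substituting graph shortest-path distances into an RBF form does not in general guarantee a positive semi-definite kernel, so a full GPLVM built this way typically needs an extra correction or regularisation step.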
doi 10.1007/s00521-017-3273-4
issn 0941-0643
eissn 1433-3058
source SpringerLink Journals - AutoHoldings
subjects Algorithms
Artificial Intelligence
Classification
Clustering
Computational Biology/Bioinformatics
Computational Science and Engineering
Computer Science
Data Mining and Knowledge Discovery
Data points
Dimensional analysis
Gaussian process
Image Processing and Computer Vision
Kernels
Labels
Mapping
Mathematical analysis
Original Article
Power efficiency
Principal components analysis
Probability and Statistics in Computer Science
Reduction
Topology
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T21%3A06%3A17IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Discriminative%20geodesic%20Gaussian%20process%20latent%20variable%20model%20for%20structure%20preserving%20dimension%20reduction%20in%20clustering%20and%20classification%20problems&rft.jtitle=Neural%20computing%20&%20applications&rft.au=Heidari,%20Mahdi&rft.date=2019-08-01&rft.volume=31&rft.issue=8&rft.spage=3265&rft.epage=3278&rft.pages=3265-3278&rft.issn=0941-0643&rft.eissn=1433-3058&rft_id=info:doi/10.1007/s00521-017-3273-4&rft_dat=%3Cproquest_cross%3E2285558249%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2285558249&rft_id=info:pmid/&rfr_iscdi=true