Regularized least squares locality preserving projections with applications to image recognition

Bibliographic Details
Published in: Neural networks, 2020-08, Vol. 128, p. 322-330
Main authors: Wei, Wei; Dai, Hua; Liang, Weitai
Format: Article
Language: English
Abstract: Locality preserving projection (LPP), a well-known technique for dimensionality reduction, is designed to preserve the local structure of the original samples, which usually lie on a low-dimensional manifold in the real world. However, it suffers from the undersampled (small-sample-size) problem: when the dimension of the features is larger than the number of samples, the corresponding generalized eigenvalue problem becomes ill-posed. To address this problem, we show that LPP is equivalent to a multivariate linear regression under a mild condition, and establish the connection between LPP and a least squares problem with multiple columns on the right-hand side. Based on the developed connection, we propose two regularized least squares methods for solving LPP. Experimental results on real-world databases illustrate the performance of our methods.

Highlights:
• LPP is shown to be equivalent to a multivariate linear regression under a mild condition.
• A connection between LPP and a least squares problem is established.
• Based on the developed connection, two regularized least squares methods for solving LPP are proposed.
• The relationships among the two regularization methods and Laplacianface are analyzed.
• Experimental results illustrate the superior performance of the proposed regularization methods.
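The record does not give the authors' exact algorithms, but the core idea the abstract describes — a least squares problem with a multi-column right-hand side that becomes ill-posed when the feature dimension d exceeds the sample count n, made solvable by regularization — can be sketched as a generic Tikhonov-regularized least squares solve. The function name and the toy dimensions below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def regularized_lstsq(A, B, lam):
    """Solve min_W ||A W - B||_F^2 + lam * ||W||_F^2 via the normal equations.

    A : (n, d) data matrix; B : (n, k) multi-column right-hand side.
    The ridge term lam * I makes A^T A + lam * I positive definite, so the
    system stays well-posed even when d > n (the small-sample-size regime
    in which plain least squares is ill-posed).
    """
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ B)

# Toy example with more features (d = 5) than samples (n = 3),
# where A^T A is singular and unregularized least squares has no
# unique solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
B = rng.standard_normal((3, 2))
W = regularized_lstsq(A, B, lam=1e-2)
print(W.shape)  # (5, 2)
```

This is only the shared skeleton of ridge-type methods; the paper's contribution is the specific connection between LPP and such a problem, and the two particular regularized formulations built on it.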
DOI: 10.1016/j.neunet.2020.05.023
ISSN: 0893-6080
EISSN: 1879-2782
Source: ScienceDirect Journals (5 years ago - present)
Subjects: Dimensionality reduction; Locality preserving projection; Regularized least squares; Small-sample-size problem