Semi-supervised geometric mean of Kullback-Leibler divergences for subspace selection

Subspace selection is widely adopted in many areas of pattern recognition. A recent method, maximizing the geometric mean of Kullback-Leibler (KL) divergences of class pairs (MGMD), performs subspace selection successfully and can significantly reduce the class separation problem. In many applications, however, labeled data are very limited while unlabeled data can be obtained easily, and estimates of class-pair divergences are unstable when computed from inadequate labeled data. To exploit unlabeled data for subspace selection, semi-supervised MGMD (SSMGMD) is proposed, using the graph Laplacian as normalization. A quasi-Newton method is adopted to solve the resulting optimization problem. Experiments on synthetic data and real image data show the validity of SSMGMD.
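The abstract names the MGMD objective only in words. As a rough illustration (a reconstruction from the abstract, not the authors' exact formulation), the sketch below computes the KL divergence between projected class-conditional Gaussians and the sum of log pairwise divergences, i.e. the log of their geometric mean; the function names `gaussian_kl` and `mgmd_objective` and the equal weighting of class pairs are assumptions.

```python
# Illustrative sketch only: reconstructs the MGMD objective from the abstract.
import numpy as np

def gaussian_kl(m1, S1, m2, S2):
    """KL divergence KL( N(m1, S1) || N(m2, S2) ) between two d-dim Gaussians."""
    d = m1.shape[0]
    S2_inv = np.linalg.inv(S2)
    diff = m2 - m1
    _, logdet1 = np.linalg.slogdet(S1)   # slogdet for numerical stability
    _, logdet2 = np.linalg.slogdet(S2)
    return 0.5 * (np.trace(S2_inv @ S1) + diff @ S2_inv @ diff
                  - d + logdet2 - logdet1)

def mgmd_objective(W, means, covs):
    """Sum of log class-pair KL divergences after projecting each class
    Gaussian onto the subspace spanned by W (shape D x d)."""
    total = 0.0
    c = len(means)
    for i in range(c):
        for j in range(c):
            if i != j:  # KL is asymmetric, so both orderings contribute
                mi, mj = W.T @ means[i], W.T @ means[j]
                Si, Sj = W.T @ covs[i] @ W, W.T @ covs[j] @ W
                total += np.log(gaussian_kl(mi, Si, mj, Sj))
    return total
```

Maximizing this sum of logs is equivalent to maximizing the geometric mean of the divergences itself, which is the usual way the "geometric mean of KL divergences" phrasing is operationalized.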

Bibliographic Details
Main Authors: Si-Bao Chen, Hai-Xian Wang, Xing-Yi Zhang, Bin Luo
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Order full text
container_end_page 1235
container_issue
container_start_page 1232
container_title
container_volume 2
creator Si-Bao Chen
Hai-Xian Wang
Xing-Yi Zhang
Bin Luo
description Subspace selection is widely adopted in many areas of pattern recognition. A recent method, maximizing the geometric mean of Kullback-Leibler (KL) divergences of class pairs (MGMD), performs subspace selection successfully and can significantly reduce the class separation problem. In many applications, however, labeled data are very limited while unlabeled data can be obtained easily, and estimates of class-pair divergences are unstable when computed from inadequate labeled data. To exploit unlabeled data for subspace selection, semi-supervised MGMD (SSMGMD) is proposed, using the graph Laplacian as normalization. A quasi-Newton method is adopted to solve the resulting optimization problem. Experiments on synthetic data and real image data show the validity of SSMGMD.
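The semi-supervised step and the quasi-Newton solver are likewise only named in the description. A hedged sketch follows, reusing `mgmd_objective` from the block above: it builds a k-NN graph Laplacian over all labeled plus unlabeled samples, adds a graph-smoothness penalty to the negated objective, and hands the problem to SciPy's L-BFGS-B (a quasi-Newton method; the record does not specify which variant the paper uses). The penalty form, the edge-weighting scheme, and the parameter `beta` are all assumptions, not the paper's formulation.

```python
# Hypothetical sketch of the semi-supervised objective and quasi-Newton solve.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized k-NN graph over rows of X."""
    D2 = cdist(X, X, "sqeuclidean")
    n = X.shape[0]
    Wg = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D2[i])[1:k + 1]             # skip the point itself
        Wg[i, nbrs] = np.exp(-D2[i, nbrs] / D2[i, nbrs].mean())
    Wg = np.maximum(Wg, Wg.T)                          # symmetrize the graph
    return np.diag(Wg.sum(axis=1)) - Wg

def ssmgmd_loss(w_flat, D, d, means, covs, X_all, L, beta=0.1):
    """Negative MGMD objective plus an assumed Laplacian smoothness penalty."""
    W = w_flat.reshape(D, d)
    Z = X_all @ W                                      # project every sample
    penalty = np.trace(Z.T @ L @ Z)                    # smoothness on the graph
    return -mgmd_objective(W, means, covs) + beta * penalty

# Usage sketch (X_all stacks labeled and unlabeled samples; d is the target dim):
#   D, d = X_all.shape[1], 2
#   L = knn_laplacian(X_all)
#   res = minimize(ssmgmd_loss, np.random.randn(D * d),
#                  args=(D, d, means, covs, X_all, L), method="L-BFGS-B")
#   W_opt = res.x.reshape(D, d)
```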
doi_str_mv 10.1109/FSKD.2011.6019712
format Conference Proceeding
fulltext fulltext_linktorsrc
identifier ISBN: 9781612841809
ispartof 2011 Eighth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), 2011, Vol.2, p.1232-1235
issn
language eng
recordid cdi_ieee_primary_6019712
source IEEE Electronic Library (IEL) Conference Proceedings
subjects Covariance matrix
Educational institutions
Laplace equations
Manifolds
Optimization
Symmetric matrices
Training
title Semi-supervised geometric mean of Kullback-Leibler divergences for subspace selection