PRINCIPAL SUPPORT VECTOR MACHINES FOR LINEAR AND NONLINEAR SUFFICIENT DIMENSION REDUCTION

We introduce a principal support vector machine (PSVM) approach that can be used for both linear and nonlinear sufficient dimension reduction. The basic idea is to divide the response variables into slices and use a modified form of support vector machine to find the optimal hyperplanes that separate them. These optimal hyperplanes are then aligned by the principal components of their normal vectors. It is proved that the aligned normal vectors provide an unbiased, $\sqrt n$-consistent, and asymptotically normal estimator of the sufficient dimension reduction space. The method is then generalized to nonlinear sufficient dimension reduction using the reproducing kernel Hilbert space. In that context, the aligned normal vectors become functions and it is proved that they are unbiased in the sense that they are functions of the true nonlinear sufficient predictors. We compare PSVM with other sufficient dimension reduction methods by simulation and in real data analysis, and through both comparisons firmly establish its practical advantages.
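The slicing-and-alignment recipe described in the abstract can be sketched in a few lines. The sketch below is only an illustration, not the paper's estimator: it uses scikit-learn's standard soft-margin linear SVM as a stand-in for the authors' modified SVM, and a hypothetical single-index model (`y = beta'X + noise`) as test data.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, p, d = 500, 5, 1
beta = np.zeros(p)
beta[0] = 1.0                                # true sufficient direction (assumed for the demo)
X = rng.standard_normal((n, p))
y = X @ beta + 0.2 * rng.standard_normal(n)  # hypothetical single-index model

# Standardize the predictors
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# Slice the response at several quantiles; fit one SVM per dividing point
normals = []
for q in np.quantile(y, [0.25, 0.5, 0.75]):
    labels = np.where(y <= q, -1, 1)
    svm = LinearSVC(C=1.0, max_iter=10000).fit(Z, labels)
    normals.append(svm.coef_.ravel())        # normal vector of the separating hyperplane

# Align the normal vectors by their principal components
M = sum(np.outer(w, w) for w in normals)
eigvals, eigvecs = np.linalg.eigh(M)
B_hat = eigvecs[:, -d:]                      # top-d eigenvectors span the estimated SDR space

# Agreement between the estimate and the true direction (close to 1 when recovery succeeds)
cos = abs(B_hat.ravel() @ beta) / np.linalg.norm(B_hat)
```

The principal-component step is what distinguishes this from running independent classifiers: each slice's hyperplane normal lies (approximately) in the dimension reduction space, and the leading eigenvectors of their outer-product sum pool those noisy directions into a single estimated basis.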

Bibliographic Details
Published in: The Annals of Statistics, 2011-12, Vol. 39 (6), p. 3182-3210
Main authors: Li, Bing; Artemiou, Andreas; Li, Lexin
Publisher: Institute of Mathematical Statistics, Cleveland, OH
Format: Article
Language: English
Online access: Full text
description We introduce a principal support vector machine (PSVM) approach that can be used for both linear and nonlinear sufficient dimension reduction. The basic idea is to divide the response variables into slices and use a modified form of support vector machine to find the optimal hyperplanes that separate them. These optimal hyperplanes are then aligned by the principal components of their normal vectors. It is proved that the aligned normal vectors provide an unbiased, $\sqrt n$-consistent, and asymptotically normal estimator of the sufficient dimension reduction space. The method is then generalized to nonlinear sufficient dimension reduction using the reproducing kernel Hilbert space. In that context, the aligned normal vectors become functions and it is proved that they are unbiased in the sense that they are functions of the true nonlinear sufficient predictors. We compare PSVM with other sufficient dimension reduction methods by simulation and in real data analysis, and through both comparisons firmly establish its practical advantages.
doi 10.1214/11-aos932
identifier ISSN: 0090-5364; EISSN: 2168-8966
source JSTOR Mathematics & Statistics; JSTOR Archive Collection A-Z Listing; EZB-FREE-00999 freely available EZB journals; Project Euclid Complete
subjects 62-09
62G08
62H12
Algebra
Contour regression
Dimensional analysis
Dimensionality reduction
Estimators
Exact sciences and technology
Functional analysis
General topics
Hilbert spaces
Hyperplanes
invariant kernel
inverse regression
Linear regression
Linear transformations
Mathematical analysis
Mathematical vectors
Mathematics
Multivariate analysis
Nonlinear equations
Objective functions
principal components
Probability and statistics
reproducing kernel Hilbert space
Sample size
Sciences and techniques of general use
Statistical analysis
Statistics
Studies
support vector machine
Support vector machines
Vowels