Structure learning via unstructured kernel-based M-regression

In statistical learning, identifying the underlying structure of the true target function from observed data plays a crucial role in facilitating subsequent modeling and analysis. Unlike most existing methods, which focus on specific settings under particular model assumptions, this paper proposes a general and novel framework for recovering the true structure of a target function via unstructured M-regression in a reproducing kernel Hilbert space (RKHS). The framework is motivated by the fact that gradient functions serve as a valid tool for learning underlying structure, including sparse learning, interaction selection, and model identification, and it is easy to implement thanks to the nice properties of the RKHS. More importantly, it admits a wide range of loss functions, and thus covers many commonly used methods, such as mean regression, quantile regression, likelihood-based classification, and margin-based classification, while remaining computationally efficient, since only convex optimization tasks need to be solved. Asymptotic results are established for a rich family of loss functions without any explicit model specification. The superior performance of the framework is demonstrated on a variety of simulated examples and a real case study.

Full Description

Bibliographic Details
Main Authors: He, Xin; Ge, Yeheng; Feng, Xingdong
Format: Article
Language: English
Description: In statistical learning, identifying the underlying structure of the true target function from observed data plays a crucial role in facilitating subsequent modeling and analysis. Unlike most existing methods, which focus on specific settings under particular model assumptions, this paper proposes a general and novel framework for recovering the true structure of a target function via unstructured M-regression in a reproducing kernel Hilbert space (RKHS). The framework is motivated by the fact that gradient functions serve as a valid tool for learning underlying structure, including sparse learning, interaction selection, and model identification, and it is easy to implement thanks to the nice properties of the RKHS. More importantly, it admits a wide range of loss functions, and thus covers many commonly used methods, such as mean regression, quantile regression, likelihood-based classification, and margin-based classification, while remaining computationally efficient, since only convex optimization tasks need to be solved. Asymptotic results are established for a rich family of loss functions without any explicit model specification. The superior performance of the framework is demonstrated on a variety of simulated examples and a real case study.
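To make the core idea concrete: the abstract notes that gradient functions of an unstructured fit can reveal structure such as sparsity. The sketch below is not the authors' estimator; it is a generic, minimal illustration of that idea using kernel ridge regression (squared loss) with an RBF kernel, where the fitted function's partial derivatives are available in closed form and their empirical norms rank the input coordinates. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    # k(u, v) = exp(-gamma * ||u - v||^2), computed pairwise
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_kernel_ridge(X, y, gamma=1.0, lam=1e-3):
    # Representer theorem: f(x) = sum_i alpha_i k(x_i, x)
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def gradient_importance(X, alpha, gamma=1.0):
    # For the RBF kernel, df/dx^(p)(x) = sum_i alpha_i k(x_i, x) * 2*gamma*(x_i^(p) - x^(p)),
    # so the gradient of the fit is available in closed form at every sample point.
    K = rbf_kernel(X, X, gamma)                  # shape (n, n)
    diff = X[:, None, :] - X[None, :, :]         # diff[i, j, p] = x_i^(p) - x_j^(p)
    grads = 2 * gamma * np.einsum('i,ij,ijp->jp', alpha, K, diff)  # gradient at each x_j
    # Empirical L2 norm of each partial derivative: near-zero => coordinate likely irrelevant
    return np.sqrt((grads ** 2).mean(axis=0))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2       # only coordinates 0 and 1 matter
alpha = fit_kernel_ridge(X, y, gamma=1.0, lam=1e-3)
scores = gradient_importance(X, alpha, gamma=1.0)
print(scores)  # the first two scores should dominate the last three
```

The paper's framework generalizes this recipe beyond squared loss to general M-regression (quantile, likelihood-based, and margin-based losses), but the mechanism sketched here, estimating an unstructured fit in an RKHS and reading structure off its gradient norms, is the same.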
DOI: 10.48550/arxiv.1901.00615
Date: 2019-01-03
Rights: http://arxiv.org/licenses/nonexclusive-distrib/1.0 (free to read)
Source: arXiv.org
Subjects: Computer Science - Learning; Statistics - Machine Learning
URL: https://arxiv.org/abs/1901.00615