Distribution-free uncertainty quantification for kernel methods by gradient perturbations
We propose a data-driven approach to quantify the uncertainty of models constructed by kernel methods. Our approach minimizes the needed distributional assumptions; hence, instead of working with, for example, Gaussian processes or exponential families, it only requires knowledge about some mild regularity of the measurement noise, such as its being symmetric or exchangeable. We show, by building on recent results from finite-sample system identification, that by perturbing the residuals in the gradient of the objective function, information can be extracted about the amount of uncertainty our model has. In particular, we provide an algorithm to build exact, non-asymptotically guaranteed, distribution-free confidence regions for ideal, noise-free representations of the function we try to estimate. For the typical convex quadratic problems and symmetric noises, the regions are star convex, centered around a given nominal estimate, and have efficient ellipsoidal outer approximations. Finally, we illustrate the ideas on typical kernel methods, such as LS-SVC, KRR, ε-SVR and kernelized LASSO.
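To make the perturbation idea concrete, here is a minimal Python sketch of a sign-perturbation rank test for kernel ridge regression under symmetric noise: the gradient evaluated with the actual residuals is compared against copies in which the residual signs have been randomly flipped, and a candidate coefficient vector is accepted if its unperturbed gradient is not extreme among them. The function names, the RBF kernel choice, and the simplified tie handling are illustrative assumptions, not the authors' exact construction (the paper's algorithm breaks ties randomly and carries exact finite-sample guarantees).

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def in_confidence_region(alpha, K, y, lam=0.1, m=100, q=5, seed=0):
    """Return True if the candidate KRR coefficient vector `alpha` is
    accepted by a sign-perturbation rank test at level ~ 1 - q/m.

    Illustrative sketch only: assumes symmetric noise and breaks ties
    crudely, whereas the paper's construction is exact."""
    rng = np.random.default_rng(seed)
    resid = y - K @ alpha                    # residuals at the candidate
    reg = lam * (K @ alpha)                  # regularization part of the gradient
    ref = K @ resid - reg                    # unperturbed (negative) gradient
    z_ref = ref @ ref                        # reference statistic
    exceed = 0                               # perturbed stats that beat it
    for _ in range(m - 1):
        signs = rng.choice((-1.0, 1.0), size=len(y))  # random sign flips
        pert = K @ (signs * resid) - reg     # sign-perturbed gradient
        exceed += (pert @ pert) > z_ref
    # accept iff the reference is not among the q largest of the m values
    return exceed >= q
```

A quick usage sketch (hypothetical data): the nominal KRR estimate has a zero gradient term, so it is always accepted, consistent with the regions being star convex around the nominal estimate.

```python
rng = np.random.default_rng(42)
X = rng.normal(size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.laplace(size=30)       # symmetric noise
K = rbf_kernel(X, X)
alpha_hat = np.linalg.solve(K + 0.1 * np.eye(30), y)   # nominal KRR estimate
print(in_confidence_region(alpha_hat, K, y, lam=0.1))  # True: the center is in
```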
Saved in:
Published in: | Machine learning 2019-09, Vol.108 (8-9), p.1677-1699 |
---|---|
Main authors: | Csáji, Balázs Cs ; Kis, Krisztián B. |
Format: | Article |
Language: | eng |
Subjects: | Algorithms ; Approximation ; Artificial Intelligence ; Computer Science ; Control ; Estimating techniques ; Gaussian process ; Kernels ; Mechatronics ; Natural Language Processing (NLP) ; Noise measurement ; Nonparametric statistics ; Robotics ; Simulation and Modeling ; Special Issue of the ECML PKDD 2019 Journal Track ; System identification ; Uncertainty |
Online access: | Full text |
DOI: | 10.1007/s10994-019-05822-1 |
---|---|
ISSN: | 0885-6125 |
EISSN: | 1573-0565 |
Publisher: | New York: Springer US |
Source: | SpringerNature Journals |