Learning Rates of Least-Square Regularized Regression
This paper considers the regularized learning algorithm associated with the least-square loss and reproducing kernel Hilbert spaces. The goal is the error analysis for the regression problem in learning theory. A novel regularization approach is presented, which yields satisfactory learning rates. The rates depend on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. When the kernel is C^∞ and the regression function lies in the corresponding reproducing kernel Hilbert space, the rate is m^{−ζ} with ζ arbitrarily close to 1, regardless of the variance of the bounded probability distribution.
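For concreteness, the scheme the abstract describes is Tikhonov-regularized least squares over an RKHS (kernel ridge regression): given samples z = {(x_i, y_i)}, minimize (1/m) Σ (f(x_i) − y_i)² + λ‖f‖²_K over f in H_K, which by the representer theorem reduces to solving (K + mλI)α = y for the Gram matrix K. Below is a minimal NumPy sketch under that reading; the Gaussian kernel (a C^∞ kernel, matching the abstract's fast-rate case), the bandwidth `sigma`, and the regularization parameter `lam` are illustrative choices, not values from the paper.

```python
# Minimal sketch of least-square regularized regression in an RKHS:
#   f_z = argmin_{f in H_K} (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
# Representer theorem: f_z(x) = sum_i alpha_i * K(x, x_i),
# with coefficients solving (K + m * lam * I) alpha = y.

import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Gaussian (C-infinity) kernel K(x, t) = exp(-|x - t|^2 / (2 sigma^2))."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_regularized_ls(X, y, lam, sigma=1.0):
    """Solve (K + m*lam*I) alpha = y; K is the m x m kernel Gram matrix."""
    m = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + m * lam * np.eye(m), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    """Evaluate f_z at new points: f_z(x) = sum_i alpha_i K(x, x_i)."""
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Example: recover a smooth regression function from bounded noisy samples.
rng = np.random.default_rng(0)
m = 200
X = rng.uniform(-1, 1, size=(m, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(m)
alpha = fit_regularized_ls(X, y, lam=1e-3, sigma=0.3)
X_test = np.linspace(-1, 1, 5)[:, None]
print(predict(X, alpha, X_test, sigma=0.3))
```

As the abstract indicates, the paper's contribution is the error analysis of this estimator (how fast f_z approaches the regression function as m grows), not the implementation itself.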
Saved in:
Published in: Foundations of Computational Mathematics, 2006-04, Vol. 6 (2), pp. 171-192
Main authors: Wu, Qiang; Ying, Yiming; Zhou, Ding-Xuan
Format: Article
Language: English
Subjects: Algorithms; Analysis; Error analysis; Hilbert space; Least squares; Measurement; Regression analysis
Online access: Full text
DOI: 10.1007/s10208-004-0155-9
ISSN: 1615-3375
EISSN: 1615-3383
Source: SpringerLink Journals