Generalization errors of Laplacian regularized least squares regression

Semi-supervised learning is an emerging computational paradigm for machine learning that aims to make better use of large amounts of inexpensive unlabeled data in order to improve learning performance. While various methods have been proposed based on different intuitions, the crucial issue of generalization performance is still poorly understood. In this paper, we investigate the convergence property of Laplacian regularized least squares regression, a semi-supervised learning algorithm based on manifold regularization. Moreover, to the best of our knowledge, an improvement of the error bounds in terms of the numbers of labeled and unlabeled data is presented for the first time. The convergence rate depends on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. Some new techniques are exploited in the analysis, since an extra regularizer is introduced.
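
For context, the algorithm studied in the article is usually written as the following optimization problem over a reproducing kernel Hilbert space. This is the standard Laplacian regularized least squares (manifold regularization) formulation from the literature, shown here only as background: the exact normalization of the penalty terms in the paper may differ, and the parameter names gamma_A, gamma_I are illustrative.

```latex
\hat{f} \;=\; \arg\min_{f \in \mathcal{H}_K}\;
  \frac{1}{l}\sum_{i=1}^{l}\bigl(y_i - f(x_i)\bigr)^{2}
  \;+\; \gamma_A \,\|f\|_{K}^{2}
  \;+\; \frac{\gamma_I}{(l+u)^{2}}\, \mathbf{f}^{\top} L_{G}\, \mathbf{f},
\qquad
\mathbf{f} = \bigl(f(x_1),\dots,f(x_{l+u})\bigr)^{\top}
```

Here l is the number of labeled samples, u the number of unlabeled samples, ||f||_K the usual RKHS-norm regularizer, and the quadratic form with the graph Laplacian L_G, built from all l+u samples, is the "extra regularizer" mentioned in the abstract.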

Detailed Description

Bibliographic Details
Published in: Science China. Mathematics, 2012-09, Vol. 55 (9), p. 1859-1868
Main authors: Cao, Ying; Chen, DiRong
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 1868
container_issue 9
container_start_page 1859
container_title Science China. Mathematics
container_volume 55
creator Cao, Ying
Chen, DiRong
description Semi-supervised learning is an emerging computational paradigm for machine learning that aims to make better use of large amounts of inexpensive unlabeled data in order to improve learning performance. While various methods have been proposed based on different intuitions, the crucial issue of generalization performance is still poorly understood. In this paper, we investigate the convergence property of Laplacian regularized least squares regression, a semi-supervised learning algorithm based on manifold regularization. Moreover, to the best of our knowledge, an improvement of the error bounds in terms of the numbers of labeled and unlabeled data is presented for the first time. The convergence rate depends on the approximation property and on the capacity of the reproducing kernel Hilbert space, measured by covering numbers. Some new techniques are exploited in the analysis, since an extra regularizer is introduced.
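
By the representer theorem, the minimizer of the manifold-regularized objective admits a finite expansion over all l+u samples, so the estimator can be computed in closed form. The sketch below is a minimal illustration of that closed form under the standard formulation shown above; the function name, argument names, and the convention that labeled points come first are assumptions made for illustration, not the authors' code.

```python
import numpy as np

def laprls_coefficients(K, L, y_labeled, n_labeled, gamma_A, gamma_I):
    """Minimal sketch of the closed-form LapRLS solution (illustrative only).

    K         : (n, n) kernel matrix over all labeled + unlabeled points,
                with the n_labeled labeled points in the first rows/columns
    L         : (n, n) graph Laplacian built from all n points
    y_labeled : (n_labeled,) responses for the labeled points
    Returns coefficients alpha such that f(x) = sum_i alpha_i * k(x, x_i).
    """
    n = K.shape[0]
    u = n - n_labeled
    J = np.zeros((n, n))
    J[:n_labeled, :n_labeled] = np.eye(n_labeled)   # selects the labeled block
    y = np.zeros(n)
    y[:n_labeled] = y_labeled                        # zero-pad unlabeled entries
    # Solve (J K + gamma_A*l*I + gamma_I*l/(l+u)^2 * L K) alpha = y
    M = (J @ K
         + gamma_A * n_labeled * np.eye(n)
         + (gamma_I * n_labeled / (n_labeled + u) ** 2) * (L @ K))
    return np.linalg.solve(M, y)
```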
doi_str_mv 10.1007/s11425-012-4438-3
format Article
publisher Heidelberg: SP Science China Press
rights Science China Press and Springer-Verlag Berlin Heidelberg 2012
fulltext fulltext
identifier ISSN: 1674-7283
ispartof Science China. Mathematics, 2012-09, Vol.55 (9), p.1859-1868
issn 1674-7283
1006-9283
1869-1862
language eng
recordid cdi_proquest_miscellaneous_1283658272
source SpringerLink Journals (MCLS); Alma/SFX Local Collection
subjects Algorithms
Applications of Mathematics
China
Convergence
Errors
Hilbert space
Learning
Least squares method
Mathematics
Mathematics and Statistics
Regression
Reproducing kernel Hilbert space
Semi-supervised learning
Learning algorithm
Laplacian operator
Least squares regression
Machine learning
Regularization
Generalization performance
title Generalization errors of Laplacian regularized least squares regression
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-29T21%3A29%3A09IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Generalization%20errors%20of%20Laplacian%20regularized%20least%20squares%20regression&rft.jtitle=Science%20China.%20Mathematics&rft.au=Cao,%20Ying&rft.date=2012-09-01&rft.volume=55&rft.issue=9&rft.spage=1859&rft.epage=1868&rft.pages=1859-1868&rft.issn=1674-7283&rft.eissn=1869-1862&rft_id=info:doi/10.1007/s11425-012-4438-3&rft_dat=%3Cproquest_cross%3E1671568758%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1283658272&rft_id=info:pmid/&rft_cqvip_id=42998062&rfr_iscdi=true