Robust Regularized Kernel Regression
Robust regression techniques are critical for fitting noisy data in real-world applications. Most previous work on robust kernel regression formulates the problem in a dual form, which is then solved by a quadratic programming solver. In this correspondence, we propose a new formula...
Saved in:
Published in: | IEEE transactions on cybernetics 2008-12, Vol.38 (6), p.1639-1644 |
---|---|
Main authors: | Jianke Zhu; Hoi, S.; Lyu, M.R.-T. |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
container_end_page | 1644 |
---|---|
container_issue | 6 |
container_start_page | 1639 |
container_title | IEEE transactions on cybernetics |
container_volume | 38 |
creator | Jianke Zhu; Hoi, S.; Lyu, M.R.-T. |
description | Robust regression techniques are critical for fitting noisy data in real-world applications. Most previous work on robust kernel regression formulates the problem in a dual form, which is then solved by a quadratic programming solver. In this correspondence, we propose a new formulation for robust regularized kernel regression under the theoretical framework of regularization networks and tackle the optimization problem directly in the primal. We show that the primal and dual approaches are equivalent in that they achieve similar regression performance, but the primal formulation is more efficient and easier to implement than the dual one. Unlike previous work, our approach also optimizes the bias term. In addition, we show that the proposed solution extends easily to other noise-resilient loss functions, including the Huber-ε insensitive loss function. Finally, we conduct a set of experiments on both artificial and real data sets; the promising results show that the proposed method is effective and more efficient than traditional approaches. |
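The primal approach the abstract describes can be sketched concretely. The following is not the authors' exact algorithm, only a minimal illustrative toy under assumed choices: primal kernel regression with a Huber loss, solved by iteratively reweighted least squares (IRLS), using an RBF kernel; all parameter values (`lam`, `gamma`, `delta`) are arbitrary demo settings.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample matrices X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def robust_kernel_regression(X, y, lam=1e-2, gamma=1.0, delta=1.0, iters=30):
    """Toy primal robust kernel regression: minimize
    sum_i huber(y_i - f(x_i)) + lam * ||f||^2 with f(x) = K(x, X) a + b,
    via iteratively reweighted least squares (IRLS)."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    a, b = np.zeros(n), 0.0
    for _ in range(iters):
        r = y - (K @ a + b)
        # Huber weights: 1 for small residuals, delta/|r| for outliers,
        # so gross outliers get little influence on the refit.
        w = np.where(np.abs(r) <= delta, 1.0,
                     delta / np.maximum(np.abs(r), 1e-12))
        W = np.diag(w)
        # Weighted, regularized normal equations for [a; b]
        # (note that the bias b is optimized jointly, as in the primal view).
        A = np.block([[K @ W @ K + lam * K, (K @ w)[:, None]],
                      [(w @ K)[None, :],    np.array([[w.sum()]])]])
        rhs = np.concatenate([K @ (w * y), [w @ y]])
        sol = np.linalg.solve(A + 1e-8 * np.eye(n + 1), rhs)
        a, b = sol[:n], sol[n]
    return a, b, K

# Toy check: noisy sine curve with one gross outlier.
rng = np.random.default_rng(0)
X = np.linspace(0, 3, 40)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
y[10] += 5.0  # inject an outlier
a, b, K = robust_kernel_regression(X, y)
pred = K @ a + b
```

Because the Huber weights shrink the influence of large residuals at each IRLS pass, the fitted curve stays close to the underlying sine instead of chasing the injected outlier; a plain (unweighted) regularized least-squares fit would be pulled toward it.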
doi_str_mv | 10.1109/TSMCB.2008.927279 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1083-4419 |
ispartof | IEEE transactions on cybernetics, 2008-12, Vol.38 (6), p.1639-1644 |
issn | 1083-4419 2168-2267 1941-0492 2168-2275 |
language | eng |
recordid | cdi_pubmed_primary_19022733 |
source | IEEE Electronic Library (IEL) |
subjects | Algorithms; Artificial Intelligence; Computer Simulation; Cybernetics; Data Interpretation, Statistical; Data mining; Equivalence; Formulations; History; Kernel; Kernel regression; Kernels; Least squares approximation; Least squares methods; Mathematical models; Mathematics; Models, Statistical; Noise robustness; Optimization; Pattern Recognition, Automated - methods; Regression; Regression Analysis; regularized least squares (RLS); Resonance light scattering; robust estimator; Solvers; Statistics; Studies; support vector machine (SVM); Support vector machines |
title | Robust Regularized Kernel Regression |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-16T12%3A51%3A27IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Robust%20Regularized%20Kernel%20Regression&rft.jtitle=IEEE%20transactions%20on%20cybernetics&rft.au=Jianke%20Zhu&rft.date=2008-12-01&rft.volume=38&rft.issue=6&rft.spage=1639&rft.epage=1644&rft.pages=1639-1644&rft.issn=1083-4419&rft.eissn=1941-0492&rft.coden=ITSCFI&rft_id=info:doi/10.1109/TSMCB.2008.927279&rft_dat=%3Cproquest_RIE%3E69813463%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=862351147&rft_id=info:pmid/19022733&rft_ieee_id=4669534&rfr_iscdi=true |