Deep Cross-Output Knowledge Transfer Using Stacked-Structure Least-Squares Support Vector Machines
This article presents a new deep cross-output knowledge transfer approach based on least-squares support vector machines, called DCOT-LS-SVMs. Its aim is to improve the generalizability of least-squares support vector machines (LS-SVMs) while avoiding the complicated parameter tuning process that occurs in many kernel machines.
Saved in:
Published in: | IEEE transactions on cybernetics 2022-05, Vol.52 (5), p.3207-3220 |
---|---|
Main authors: | Wang, Guanjin ; Choi, Kup-Sze ; Teoh, Jeremy Yuen-Chun ; Lu, Jie |
Format: | Article |
Language: | English |
Subject terms: | |
Online access: | Order full text |
container_end_page | 3220 |
---|---|
container_issue | 5 |
container_start_page | 3207 |
container_title | IEEE transactions on cybernetics |
container_volume | 52 |
creator | Wang, Guanjin Choi, Kup-Sze Teoh, Jeremy Yuen-Chun Lu, Jie |
description | This article presents a new deep cross-output knowledge transfer approach based on least-squares support vector machines, called DCOT-LS-SVMs. Its aim is to improve the generalizability of least-squares support vector machines (LS-SVMs) while avoiding the complicated parameter tuning process that occurs in many kernel machines. The proposed approach has two significant characteristics: 1) DCOT-LS-SVMs is inspired by a stacked hierarchical architecture that combines several layer-by-layer LS-SVMs modules. The module that forms the higher layer has additional input features that consider the predictions from all previous modules and 2) cross-output knowledge transfer is used to leverage knowledge from the predictions of the previous module to improve the learning process in the current module. With this approach, the model's parameters, such as a tradeoff parameter C and a kernel width \delta , can be randomly assigned to each module in order to greatly simplify the learning process. Moreover, DCOT-LS-SVMs is able to autonomously and quickly decide the extent of the cross-output knowledge transfer between adjacent modules through a fast leave-one-out cross-validation strategy. In addition, we present an imbalanced version of DCOT-LS-SVMs, called IDCOT-LS-SVMs, given that imbalanced datasets are common in real-world scenarios. The effectiveness of the proposed approaches is demonstrated through a comparison with five comparative methods on UCI datasets and with a case study on the diagnosis of prostate cancer. |
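The stacked, layer-by-layer construction the abstract describes can be sketched in a few lines: each module is a closed-form LS-SVM whose inputs are augmented with the previous modules' predictions, and its tradeoff parameter C and kernel width are drawn at random rather than tuned. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the function names, parameter ranges, and toy data are invented here, and the cross-output knowledge transfer term between adjacent modules is omitted for brevity.

```python
import numpy as np

def rbf_kernel(A, B, width):
    # Gaussian kernel K(a, b) = exp(-||a - b||^2 / (2 * width^2))
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_ls_svm(X, y, C, width):
    # Closed-form LS-SVM training: solve the linear system
    # [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, width)
    top = np.concatenate(([0.0], np.ones(n)))
    body = np.hstack((np.ones((n, 1)), K + np.eye(n) / C))
    A = np.vstack((top, body))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def predict(X_train, X_new, b, alpha, width):
    return rbf_kernel(X_new, X_train, width) @ alpha + b

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])  # toy separable labels in {-1, +1}

# Stack three modules: each layer sees the original features plus the
# previous layers' outputs; C and the kernel width are randomly assigned,
# mirroring the abstract's claim that no per-module tuning is required.
feats, models = X, []
for _ in range(3):
    C, width = rng.uniform(1, 100), rng.uniform(0.5, 2.0)
    b, alpha = fit_ls_svm(feats, y, C, width)
    out = predict(feats, feats, b, alpha, width)
    models.append((feats.copy(), b, alpha, width))
    feats = np.hstack((feats, out[:, None]))  # augment inputs for next layer

acc = (np.sign(out) == y).mean()  # training accuracy of the top module
```

In the paper's full method, each module would additionally receive a transfer term weighted by a coefficient chosen via a fast leave-one-out cross-validation; the sketch above only shows the prediction-augmentation skeleton.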
doi_str_mv | 10.1109/TCYB.2020.3008963 |
format | Article |
fullrecord | Raw Primo/ProQuest XML source record (duplicates the title, authors, abstract, and subject terms listed in the other fields). Additional identifiers it contains: EISSN 2168-2275; CODEN ITCEB8; PMID 32780705; IEEE document id 9165021; ProQuest id 2667016186; publisher: IEEE, United States; author ORCIDs 0000-0002-5258-0532, 0000-0003-0690-4732, 0000-0003-0836-7088. |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2168-2267 |
ispartof | IEEE transactions on cybernetics, 2022-05, Vol.52 (5), p.3207-3220 |
issn | 2168-2267 ; 2168-2275 |
language | eng |
recordid | cdi_pubmed_primary_32780705 |
source | IEEE Electronic Library (IEL) |
subjects | Cancer prediction; Data models; Datasets; Kernel; Kernel functions; Knowledge management; Knowledge transfer; Learning; Least squares; Least-Squares Analysis; Machine learning; Modules; Neural networks; Process parameters; Support Vector Machine; Support vector machines; support vector machines (SVMs); Training; transfer learning |
title | Deep Cross-Output Knowledge Transfer Using Stacked-Structure Least-Squares Support Vector Machines |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-08T10%3A52%3A33IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Deep%20Cross-Output%20Knowledge%20Transfer%20Using%20Stacked-Structure%20Least-Squares%20Support%20Vector%20Machines&rft.jtitle=IEEE%20transactions%20on%20cybernetics&rft.au=Wang,%20Guanjin&rft.date=2022-05-01&rft.volume=52&rft.issue=5&rft.spage=3207&rft.epage=3220&rft.pages=3207-3220&rft.issn=2168-2267&rft.eissn=2168-2275&rft.coden=ITCEB8&rft_id=info:doi/10.1109/TCYB.2020.3008963&rft_dat=%3Cproquest_RIE%3E2667016186%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2667016186&rft_id=info:pmid/32780705&rft_ieee_id=9165021&rfr_iscdi=true |