Extended MDL principle for feature-based inductive transfer learning

Transfer learning provides a solution to the practical problem of learning a target task when a large amount of auxiliary data from source domains is given. Despite numerous research studies on this topic, few of them have a solid theoretical framework and are parameter-free. In this paper, we propose an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Unlike conventional methods, our encoding measure is based on a theoretical background and has no parameter. To obtain useful features to be used in the target task, we design an enhanced encoding length by adopting a code book that stores useful information obtained from the source task. With the code book that builds connections between the source and the target tasks, our EMDLP is able to evaluate the inferiority of the results of transfer learning with the sum of the code lengths of five components: those of the corresponding two hypotheses, the two data sets with the help of the hypotheses, and the set of the transferred features. The proposed method inherits the nice property of the MDLP that elaborately evaluates the hypotheses and balances the simplicity of the hypotheses and the goodness of fit to the data. Extensive experiments using both synthetic and real data sets show that the proposed method provides better performance in terms of classification accuracy and is robust against noise.
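
As a rough illustration of the encoding scheme described in the abstract (the notation here is assumed for exposition and is not taken from the paper itself), the five-component total code length evaluated by EMDLP can be sketched as

\[
L_{\mathrm{EMDLP}} \;=\; L(h_S) + L(h_T) + L(D_S \mid h_S) + L(D_T \mid h_T) + L(F),
\]

where \(h_S\) and \(h_T\) denote the source and target hypotheses, \(L(D_S \mid h_S)\) and \(L(D_T \mid h_T)\) the code lengths of the source and target data sets given their hypotheses, and \(F\) the set of transferred features; the candidate result with the shortest total code length is preferred.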

Bibliographic Details
Published in: Knowledge and information systems, 2013-05, Vol. 35 (2), p. 365-389
Main authors: Shao, Hao; Tong, Bin; Suzuki, Einoshin
Format: Article
Language: English
Subjects:
Online access: Full text
container_end_page 389
container_issue 2
container_start_page 365
container_title Knowledge and information systems
container_volume 35
creator Shao, Hao
Tong, Bin
Suzuki, Einoshin
description Transfer learning provides a solution to the practical problem of learning a target task when a large amount of auxiliary data from source domains is given. Despite numerous research studies on this topic, few of them have a solid theoretical framework and are parameter-free. In this paper, we propose an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Unlike conventional methods, our encoding measure is based on a theoretical background and has no parameter. To obtain useful features to be used in the target task, we design an enhanced encoding length by adopting a code book that stores useful information obtained from the source task. With the code book that builds connections between the source and the target tasks, our EMDLP is able to evaluate the inferiority of the results of transfer learning with the sum of the code lengths of five components: those of the corresponding two hypotheses, the two data sets with the help of the hypotheses, and the set of the transferred features. The proposed method inherits the nice property of the MDLP that elaborately evaluates the hypotheses and balances the simplicity of the hypotheses and the goodness of fit to the data. Extensive experiments using both synthetic and real data sets show that the proposed method provides better performance in terms of classification accuracy and is robust against noise.
doi_str_mv 10.1007/s10115-012-0505-x
format Article
fulltext fulltext
identifier ISSN: 0219-1377
ispartof Knowledge and information systems, 2013-05, Vol.35 (2), p.365-389
issn 0219-1377
0219-3116
language eng
recordid cdi_proquest_miscellaneous_1349456845
source SpringerLink Journals - AutoHoldings
subjects Analysis
Applied sciences
Artificial intelligence
Colleges & universities
Computer programming
Computer Science
Computer science; control theory; systems
Data Mining and Knowledge Discovery
Data processing. List processing. Character string processing
Database Management
Datasets
Encoding
Exact sciences and technology
Hypotheses
Information Storage and Retrieval
Information systems
Information Systems and Communication Service
Information Systems Applications (incl.Internet)
IT in Business
Learning
Learning and adaptive systems
Memory organisation. Data processing
Regular Paper
Software
Stores
Studies
Tasks
title Extended MDL principle for feature-based inductive transfer learning
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-08T11%3A14%3A59IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Extended%20MDL%20principle%20for%20feature-based%20inductive%20transfer%20learning&rft.jtitle=Knowledge%20and%20information%20systems&rft.au=Shao,%20Hao&rft.date=2013-05-01&rft.volume=35&rft.issue=2&rft.spage=365&rft.epage=389&rft.pages=365-389&rft.issn=0219-1377&rft.eissn=0219-3116&rft.coden=KISNCR&rft_id=info:doi/10.1007/s10115-012-0505-x&rft_dat=%3Cproquest_cross%3E1349456845%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1323967963&rft_id=info:pmid/&rfr_iscdi=true