Knowledge transfer via distillation from time and frequency domain for time series classification

Bibliographic details
Published in: Applied intelligence (Dordrecht, Netherlands), 2023, Vol.53 (2), p.1505-1516
Main authors: Ouyang, Kewei; Hou, Yi; Zhang, Ye; Ma, Chao; Zhou, Shilin
Format: Article
Language: English
Online access: Full text
Description: Although deep learning has achieved great success on time series classification, two issues remain unsolved. First, existing methods mainly extract features in a single domain, so useful information available in other domains goes unused. Second, multi-domain learning usually increases model size, which makes deployment on mobile devices difficult. In this study, a lightweight double-branch model, called the Time Frequency Knowledge Reception Network (TFKR-Net), is proposed to simultaneously fuse information from the time and frequency domains. Instead of directly merging knowledge from teacher models pretrained in different domains, TFKR-Net independently distills knowledge from the teacher models in the time and frequency domains, which helps maintain knowledge diversity. Experimental results on the UCR (University of California, Riverside) archive demonstrate that TFKR-Net significantly reduces model size and improves computational efficiency with only a small loss in classification accuracy.
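
The record gives only the abstract, not implementation details. As a rough illustration of the dual-teacher scheme it describes (a time-domain teacher and a frequency-domain teacher, each distilled independently into its own student branch), here is a minimal PyTorch-style sketch. The Branch architecture, the magnitude-spectrum frequency view, the late-fusion rule, and the loss weighting alpha are all assumptions for illustration, not the authors' actual TFKR-Net.

```python
# Illustrative sketch only (assumed, not the authors' code): one training step
# of dual-teacher distillation as described in the abstract. Branch design,
# fusion rule, and loss weighting are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Branch(nn.Module):
    """Small 1-D conv encoder standing in for one lightweight student branch."""
    def __init__(self, n_classes: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # works for both time- and frequency-length inputs
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):              # x: (batch, 1, length)
        return self.head(self.encoder(x).squeeze(-1))

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Hinton-style distillation: KL divergence between softened distributions."""
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean",
    ) * (t * t)

def train_step(x, y, student_time, student_freq, teacher_time, teacher_freq,
               alpha=0.5):
    """Each student branch distills independently from its own-domain teacher."""
    x_freq = torch.fft.rfft(x, dim=-1).abs()   # frequency-domain view of the series
    logits_time = student_time(x)
    logits_freq = student_freq(x_freq)
    fused = (logits_time + logits_freq) / 2    # simple late fusion (assumed)
    with torch.no_grad():                      # teachers are pretrained and frozen
        t_time = teacher_time(x)
        t_freq = teacher_freq(x_freq)
    return (F.cross_entropy(fused, y)
            + alpha * kd_loss(logits_time, t_time)
            + alpha * kd_loss(logits_freq, t_freq))
```

Keeping one distillation term per domain, rather than first merging the two teachers' outputs, matches the abstract's claim that independent distillation helps maintain knowledge diversity.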
DOI: 10.1007/s10489-022-03485-5
ISSN: 0924-669X
EISSN: 1573-7497
Source: SpringerNature Journals
Subjects: Artificial Intelligence
Classification
Computer Science
Deep learning
Distillation
Electronic devices
Feature extraction
Frequency domain analysis
Knowledge management
Machine learning
Machines
Manufacturing
Mechanical Engineering
Processes
Teachers
Time series