Dual-Tuning: Joint Prototype Transfer and Structure Regularization for Compatible Feature Learning

Visual retrieval systems face frequent model updates and redeployment. Re-extracting features for the entire database after every update is a heavy workload. Feature compatibility allows newly learned visual features to be compared directly with the old features already stored in the database, so the inflexible and time-consuming re-extraction step can be bypassed when the deployed model is updated. However, the old feature space that must be matched is not ideal and contains outlier samples. Moreover, the new and old models may be supervised by different losses, which further causes a distribution discrepancy between the two feature spaces. In this work, we propose Dual-Tuning, a global optimization method that achieves feature compatibility across different networks and losses. A feature-level prototype loss explicitly aligns the two types of embedding features by transferring global prototype information, and a component-level mutual structural regularization implicitly optimizes the intrinsic structure of the features. Experiments are conducted on six datasets, including person re-identification datasets, face recognition datasets, and the million-scale ImageNet and Places365. The results demonstrate that Dual-Tuning achieves feature compatibility without sacrificing performance.

Detailed Description

Bibliographic Details
Published in: IEEE transactions on multimedia 2023-01, Vol.25, p.1-13
Main Authors: Bai, Yan, Jiao, Jile, Lou, Yihang, Wu, Shengsen, Liu, Jun, Feng, Xuetao, Duan, Ling-Yu
Format: Article
Language: English
Subjects:
Online Access: Order full text
container_end_page 13
container_issue
container_start_page 1
container_title IEEE transactions on multimedia
container_volume 25
creator Bai, Yan
Jiao, Jile
Lou, Yihang
Wu, Shengsen
Liu, Jun
Feng, Xuetao
Duan, Ling-Yu
description Visual retrieval systems face frequent model updates and redeployment. Re-extracting features for the entire database after every update is a heavy workload. Feature compatibility allows newly learned visual features to be compared directly with the old features already stored in the database, so the inflexible and time-consuming re-extraction step can be bypassed when the deployed model is updated. However, the old feature space that must be matched is not ideal and contains outlier samples. Moreover, the new and old models may be supervised by different losses, which further causes a distribution discrepancy between the two feature spaces. In this work, we propose Dual-Tuning, a global optimization method that achieves feature compatibility across different networks and losses. A feature-level prototype loss explicitly aligns the two types of embedding features by transferring global prototype information, and a component-level mutual structural regularization implicitly optimizes the intrinsic structure of the features. Experiments are conducted on six datasets, including person re-identification datasets, face recognition datasets, and the million-scale ImageNet and Places365. The results demonstrate that Dual-Tuning achieves feature compatibility without sacrificing performance.
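To make the description above concrete, the following is a minimal, hypothetical PyTorch-style sketch of the two ideas the abstract names: a prototype loss that pulls new-model embeddings toward frozen class prototypes of the old model, and a simple structural regularizer that matches pairwise similarities across the two feature spaces. This is not the authors' implementation; the function names, the cosine-similarity formulation, and the loss weighting are assumptions.

```python
# Hypothetical sketch of backward-compatible training in the spirit of Dual-Tuning.
import torch
import torch.nn.functional as F


def prototype_transfer_loss(new_feats, labels, old_prototypes):
    """Pull new-model embeddings toward the frozen class prototypes of the old model.

    new_feats:      (B, D) features from the new model
    labels:         (B,)   class indices
    old_prototypes: (C, D) per-class mean features of the old model
    """
    new_feats = F.normalize(new_feats, dim=1)
    protos = F.normalize(old_prototypes, dim=1)
    logits = new_feats @ protos.t()          # cosine similarity to every old prototype
    return F.cross_entropy(logits, labels)   # classify new features against old prototypes


def structure_regularization(new_feats, old_feats):
    """Encourage the pairwise similarity structure of the two feature spaces to agree."""
    sim_new = F.normalize(new_feats, dim=1) @ F.normalize(new_feats, dim=1).t()
    sim_old = F.normalize(old_feats, dim=1) @ F.normalize(old_feats, dim=1).t()
    return F.mse_loss(sim_new, sim_old)


# Toy usage with random tensors standing in for a training batch.
B, D, C = 32, 256, 100
new_feats = torch.randn(B, D, requires_grad=True)
old_feats = torch.randn(B, D)                 # frozen old-model features
old_prototypes = torch.randn(C, D)            # frozen per-class means of old features
labels = torch.randint(0, C, (B,))

loss = prototype_transfer_loss(new_feats, labels, old_prototypes) \
       + 0.1 * structure_regularization(new_feats, old_feats)
loss.backward()
print(float(loss))
```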
doi_str_mv 10.1109/TMM.2022.3219680
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1520-9210
ispartof IEEE transactions on multimedia, 2023-01, Vol.25, p.1-13
issn 1520-9210
1941-0077
language eng
recordid cdi_ieee_primary_9939072
source IEEE Electronic Library (IEL)
subjects Compatibility
Compatible feature learning
Datasets
Face recognition
Faces
Feature extraction
Global optimization
Optimization
Outliers (statistics)
prototype transfer
Prototypes
Regularization
structure regularization
Training
Tuning
Visualization
title Dual-Tuning: Joint Prototype Transfer and Structure Regularization for Compatible Feature Learning
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-12T12%3A27%3A13IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Dual-Tuning:%20Joint%20Prototype%20Transfer%20and%20Structure%20Regularization%20for%20Compatible%20Feature%20Learning&rft.jtitle=IEEE%20transactions%20on%20multimedia&rft.au=Bai,%20Yan&rft.date=2023-01-01&rft.volume=25&rft.spage=1&rft.epage=13&rft.pages=1-13&rft.issn=1520-9210&rft.eissn=1941-0077&rft.coden=ITMUF8&rft_id=info:doi/10.1109/TMM.2022.3219680&rft_dat=%3Cproquest_RIE%3E2887111654%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2887111654&rft_id=info:pmid/&rft_ieee_id=9939072&rfr_iscdi=true