Res-Tuning: A Flexible and Efficient Tuning Paradigm via Unbinding Tuner from Backbone

Parameter-efficient tuning has become a trend in transferring large-scale foundation models to downstream applications. Existing methods typically embed some lightweight tuners into the backbone, where both the design and the learning of the tuners are highly dependent on the base model. This work offers a new tuning paradigm, dubbed Res-Tuning, which intentionally unbinds tuners from the backbone. With both theoretical and empirical evidence, we show that popular tuning approaches have their equivalent counterparts under our unbinding formulation, and hence can be integrated into our framework effortlessly. Thanks to the structural disentanglement, we manage to free the design of tuners from the network architecture, facilitating flexible combination of various tuning strategies. We further propose a memory-efficient variant of Res-Tuning, where the bypass (i.e., formed by a sequence of tuners) is effectively detached from the main branch, such that the gradients are back-propagated only to the tuners but not to the backbone. Such a detachment also allows one-time backbone forward for multi-task inference. Extensive experiments on both discriminative and generative tasks demonstrate the superiority of our method over existing alternatives from the perspectives of efficacy and efficiency. Project page: https://res-tuning.github.io/.
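
To make the unbinding idea concrete, the following is a minimal PyTorch sketch of a tuner expressed as a parallel residual branch outside the backbone block, rather than embedded inside it. The names (UnboundTuner, res_tuned_forward) and the bottleneck design are illustrative assumptions, not the authors' released implementation.

```python
import torch.nn as nn

class UnboundTuner(nn.Module):
    """Adapter-style bottleneck, attached as a parallel residual branch
    instead of being inserted inside the backbone block."""
    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return self.up(self.act(self.down(x)))

def res_tuned_forward(block: nn.Module, tuner: nn.Module, x):
    # The frozen main branch and the tuner branch are computed independently
    # and then summed, so the tuner's design no longer depends on the block's
    # internal structure (assumes the block preserves the feature dimension).
    return block(x) + tuner(x)
```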

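The memory-efficient variant can be sketched in the same spirit. Assuming a frozen PyTorch-style backbone, the backbone forward runs once under no_grad, and the detached features feed a separate chain of tuners, so back-propagation reaches only the tuner parameters. The cached features can also be shared by several task-specific bypasses, matching the one-time backbone forward for multi-task inference described in the abstract. All names here are hypothetical stand-ins.

```python
import torch
import torch.nn as nn

def make_tuner(dim: int, bottleneck: int = 16) -> nn.Module:
    # Lightweight bottleneck tuner; only these parameters are trained.
    return nn.Sequential(nn.Linear(dim, bottleneck), nn.GELU(),
                         nn.Linear(bottleneck, dim))

@torch.no_grad()
def backbone_features(blocks: nn.ModuleList, x: torch.Tensor):
    """Single frozen forward pass; the returned activations carry no graph
    and can be cached once, then reused across tasks."""
    feats = [x]
    for blk in blocks:
        x = blk(x)
        feats.append(x)
    return feats

class ResTuningBypass(nn.Module):
    """A chain of tuners forming a side branch detached from the backbone."""
    def __init__(self, dim: int, depth: int):
        super().__init__()
        self.tuners = nn.ModuleList(make_tuner(dim) for _ in range(depth))

    def forward(self, feats):
        # feats come from the no-grad backbone pass, so gradients flow
        # through the tuners only, never into the backbone.
        h = feats[0]
        for tuner, f in zip(self.tuners, feats[1:]):
            h = f + tuner(h)  # fuse the frozen feature with a tuned residual
        return h

# Usage sketch (names hypothetical): one backbone forward, many task bypasses.
#   feats = backbone_features(vit_blocks, tokens)
#   out_a = bypass_for_task_a(feats)
#   out_b = bypass_for_task_b(feats)
```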

Bibliographic Details
Published in: arXiv.org, 2023-10
Authors: Jiang, Zeyinzi, Mao, Chaojie, Huang, Ziyuan, Ao Ma, Lv, Yiliang, Shen, Yujun, Zhao, Deli, Zhou, Jingren
Format: Article
Language: eng
Subjects: Back propagation; Tuners; Tuning; Weight reduction
Online access: Full text
container_title arXiv.org
creator Jiang, Zeyinzi
Mao, Chaojie
Huang, Ziyuan
Ao Ma
Lv, Yiliang
Shen, Yujun
Zhao, Deli
Zhou, Jingren
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2023-10
issn 2331-8422
language eng
recordid cdi_proquest_journals_2884924638
source Free E-Journals
subjects Back propagation
Tuners
Tuning
Weight reduction
title Res-Tuning: A Flexible and Efficient Tuning Paradigm via Unbinding Tuner from Backbone