Res-Tuning: A Flexible and Efficient Tuning Paradigm via Unbinding Tuner from Backbone
Saved in:
Main authors: | , , , , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Parameter-efficient tuning has become a trend in transferring large-scale
foundation models to downstream applications. Existing methods typically embed
some light-weight tuners into the backbone, where both the design and the
learning of the tuners are highly dependent on the base model. This work offers
a new tuning paradigm, dubbed Res-Tuning, which intentionally unbinds tuners
from the backbone. With both theoretical and empirical evidence, we show that
popular tuning approaches have equivalent counterparts under our unbinding
formulation, and hence can be integrated into our framework effortlessly.
Thanks to the structural disentanglement, we manage to free the design of
tuners from the network architecture, facilitating flexible combination of
various tuning strategies. We further propose a memory-efficient variant of
Res-Tuning, where the bypass (i.e., the sequence of tuners) is effectively
detached from the main branch, such that gradients are back-propagated only to
the tuners and not to the backbone. Such a detachment also allows a one-time
backbone forward pass for multi-task inference. Extensive experiments on both
discriminative and generative tasks demonstrate the superiority of our method
over existing alternatives in terms of both efficacy and efficiency. Project
page: https://res-tuning.github.io/. |
DOI: | 10.48550/arxiv.2310.19859 |
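
The unbinding idea and its memory-efficient variant described in the abstract can be illustrated with a short sketch. The following PyTorch snippet is a minimal illustration under stated assumptions, not the authors' reference implementation: the names `ResTunerBlock`, `bottleneck`, and `memory_efficient` are hypothetical. It pairs a frozen backbone block with a light-weight residual tuner that lives outside the backbone; in the memory-efficient mode, the bypass input is detached so gradients reach only the tuner parameters.

```python
# Minimal sketch of the "unbinding" formulation (hypothetical names; not the
# authors' reference implementation).
import torch
import torch.nn as nn


class ResTunerBlock(nn.Module):
    """Pairs a frozen backbone block with a light-weight residual tuner."""

    def __init__(self, block: nn.Module, dim: int, bottleneck: int = 16,
                 memory_efficient: bool = True):
        super().__init__()
        self.block = block
        for p in self.block.parameters():       # freeze the backbone block
            p.requires_grad = False
        self.tuner = nn.Sequential(             # tuner is unbound from the backbone
            nn.Linear(dim, bottleneck),
            nn.GELU(),
            nn.Linear(bottleneck, dim),
        )
        self.memory_efficient = memory_efficient

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.memory_efficient:
            # Detached bypass: no backbone activations are kept for backward,
            # so gradients are back-propagated only to the tuner.
            with torch.no_grad():
                y = self.block(x)
            return y + self.tuner(x.detach())
        # Unbound but attached: the backbone stays frozen yet in the graph.
        return self.block(x) + self.tuner(x)


# Usage sketch: wrap one block and train only the tuner parameters.
layer = ResTunerBlock(nn.Linear(768, 768), dim=768)
out = layer(torch.randn(4, 768))
out.sum().backward()                            # grads flow to the tuner only
```

Because the memory-efficient bypass sees only detached backbone features, the frozen backbone's forward pass can in principle be computed once and reused across several such tuner branches, which is what enables the one-time backbone forward for multi-task inference mentioned above.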