Turbo: Informativity-Driven Acceleration Plug-In for Vision-Language Models
Format: Article
Language: English
Abstract: Vision-Language Large Models (VLMs) have become a primary backbone of AI due to their impressive performance. However, their expensive computational costs, i.e., low throughput and high latency, limit their potential in real-world scenarios. To accelerate VLMs, most existing methods focus on the model perspective: pruning, distillation, and quantization, while completely overlooking redundancy from the data perspective. To fill this gap, this paper highlights the severity of data redundancy and designs a plug-and-play Turbo module, guided by information degree, that prunes inefficient tokens from visual or textual data. In pursuit of efficiency-performance trade-offs, information degree takes two key factors into consideration: mutual redundancy and semantic value. Concretely, the former evaluates data duplication between sequential tokens, while the latter evaluates each token by its contribution to the overall semantics. As a result, tokens with a high information degree carry less redundancy and stronger semantics. During VLM computation, Turbo works as a user-friendly plug-in that sorts data by information degree and uses only the top-ranked tokens to save costs. Its advantages are multifaceted, e.g., general compatibility with various VLMs across understanding and generation, and simple use without retraining or heavy engineering effort. On multiple public VLM benchmarks, extensive experiments reveal the gratifying acceleration of Turbo under a negligible performance drop.
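
The scoring-and-pruning idea described in the abstract can be illustrated with a minimal PyTorch sketch. The function name `turbo_prune`, the [CLS]-attention proxy for semantic value, the cosine-similarity proxy for mutual redundancy, and the fusion weight `alpha` are illustrative assumptions here, not the paper's exact formulation.

```python
import torch

def turbo_prune(tokens: torch.Tensor, cls_attn: torch.Tensor,
                keep_ratio: float = 0.5, alpha: float = 0.5) -> torch.Tensor:
    """Keep the most informative tokens from an (N, D) token sequence.

    tokens:   (N, D) visual or textual token embeddings.
    cls_attn: (N,) attention weights from the [CLS] token, used here as a
              proxy for each token's semantic value (an assumption).
    """
    # Mutual redundancy: how similar each token is to the rest of the sequence.
    normed = torch.nn.functional.normalize(tokens, dim=-1)
    sim = normed @ normed.t()            # (N, N) cosine similarities
    sim.fill_diagonal_(0.0)
    redundancy = sim.mean(dim=-1)        # high value => duplicated content

    # Semantic value: contribution to the overall semantics (CLS-attention proxy).
    semantic = cls_attn / cls_attn.sum()

    # Information degree: strong semantics, low redundancy.
    info_degree = alpha * semantic - (1.0 - alpha) * redundancy

    # Keep only the top-ranked tokens to cut downstream computation.
    num_keep = max(1, int(keep_ratio * tokens.shape[0]))
    keep_idx = info_degree.topk(num_keep).indices.sort().values
    return tokens[keep_idx]
```

In this sketch the pruning is training-free and sits between the tokenizer/encoder and the rest of the VLM, which matches the plug-and-play usage described in the abstract; the concrete choice of scoring signals and keep ratio would follow the paper.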
DOI: 10.48550/arxiv.2312.07408