ASVD: Activation-aware Singular Value Decomposition for Compressing Large Language Models
Saved in:
Main author(s): | , , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | In this paper, we introduce a new post-training compression paradigm for Large Language Models (LLMs) to facilitate their wider adoption. We delve into LLM weight low-rank decomposition, and find that the challenges of this task stem from the distribution variance in the LLM activations and the sensitivity difference among various kinds of layers. To address these issues, we propose a training-free approach called Activation-aware Singular Value Decomposition (ASVD). Specifically, ASVD manages activation outliers by transforming the weight matrix based on the activation distribution. This transformation allows the outliers in the activation matrix to be absorbed into the transformed weight matrix, thereby enhancing decomposition accuracy. Additionally, we propose an efficient iterative calibration process to optimize layer-specific decomposition by addressing the varying sensitivity of different LLM layers. In this way, ASVD can compress a network by 10%-30%. Based on the success of the low-rank decomposition of projection matrices in the self-attention module, we further introduce ASVD to compress the KV cache. By reducing the channel dimension of KV activations, memory requirements for the KV cache can be largely reduced. ASVD can further achieve 50% KV cache reductions without performance drop in a training-free manner. Code is anonymously available in supplementary materials. |
DOI: | 10.48550/arxiv.2312.05821 |
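The abstract's core idea, folding an activation-derived per-channel scale into the weight matrix before a truncated SVD so that activation outliers are absorbed into the weights, can be illustrated with a minimal NumPy sketch. The function name `asvd_decompose`, the mean-absolute-value scaling rule, and the toy shapes below are illustrative assumptions, not the paper's exact recipe; in particular, the abstract's iterative, layer-specific rank calibration is not modeled here.

```python
import numpy as np


def asvd_decompose(W, X_calib, rank, eps=1e-6):
    """Sketch of an activation-aware truncated SVD for a linear layer y = W @ x.

    W:       weight matrix, shape (out_features, in_features)
    X_calib: calibration activations, shape (n_samples, in_features)
    rank:    target rank of the low-rank factors

    Returns (A, B) with A: (out_features, rank), B: (rank, in_features),
    so that W is approximated by A @ B.
    """
    # Per-input-channel scale from calibration data (assumed rule: mean |x|).
    # Outlier channels get a large scale, so their columns of W carry more
    # weight in the SVD instead of being distorted by plain truncation.
    s = np.abs(X_calib).mean(axis=0) + eps           # (in_features,)

    # Fold the scale into the weight: W @ x == (W * s) @ (x / s).
    W_scaled = W * s[None, :]

    # Ordinary truncated SVD of the scaled weight.
    U, S, Vt = np.linalg.svd(W_scaled, full_matrices=False)
    U_k, S_k, Vt_k = U[:, :rank], S[:rank], Vt[:rank, :]

    # Fold the scale back out on the right factor so (A, B) act on raw inputs.
    A = U_k * S_k[None, :]                           # (out_features, rank)
    B = Vt_k / s[None, :]                            # (rank, in_features)
    return A, B


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    out_f, in_f, n = 64, 128, 256
    W = rng.standard_normal((out_f, in_f))
    X = rng.standard_normal((n, in_f))
    X[:, :4] *= 50.0                                 # simulate outlier channels

    A, B = asvd_decompose(W, X, rank=32)
    err = np.linalg.norm(X @ W.T - X @ (A @ B).T) / np.linalg.norm(X @ W.T)
    print(f"relative output error at rank 32: {err:.4f}")
```

At inference the dense layer is replaced by two smaller matmuls (first B, then A), which is also what suggests the KV-cache use mentioned in the abstract: presumably the lower-dimensional intermediate produced by the first factor is what gets cached in place of the full key/value channels, though the abstract does not spell out that mechanism.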