An Information-Theoretic Regularizer for Lossy Neural Image Compression
Saved in:
Main authors: | , , , , , , |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
Summary: | Lossy image compression networks aim to minimize the latent entropy of images while adhering to specific distortion constraints. However, optimizing the neural network is challenging because it must learn quantized latent representations. In this paper, our key finding is that minimizing the latent entropy is, to some extent, equivalent to maximizing the conditional source entropy, an insight that is deeply rooted in information-theoretic equalities. Building on this insight, we propose a novel structural regularization method for the neural image compression task that incorporates the negative conditional source entropy into the training objective, so that both optimization efficacy and the model's generalization ability are improved. The proposed information-theoretic regularizer is interpretable, plug-and-play, and imposes no inference overhead. Extensive experiments demonstrate its superiority in regularizing the models and further squeezing bits from the latent representation across various compression structures and unseen domains. |
DOI: | 10.48550/arxiv.2411.16727 |
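
The record does not spell out the equality behind the claimed equivalence between minimizing latent entropy and maximizing conditional source entropy. A plausible reading, assuming the quantized latent $\hat{Y}$ is a deterministic function of the source $X$, is the following standard identity:

```latex
% Sketch of the identity presumably behind the claimed equivalence,
% assuming the quantized latent \hat{Y} is a deterministic function of X,
% so that H(\hat{Y} | X) = 0.
\begin{align}
  I(X;\hat{Y}) &= H(\hat{Y}) - H(\hat{Y}\mid X) = H(\hat{Y}), \\
  I(X;\hat{Y}) &= H(X) - H(X\mid\hat{Y}), \\
  \text{hence}\quad H(\hat{Y}) &= H(X) - H(X\mid\hat{Y}).
\end{align}
% H(X) is fixed by the data distribution, so minimizing the latent entropy
% H(\hat{Y}) amounts to maximizing the conditional source entropy
% H(X | \hat{Y}), which motivates adding -H(X | \hat{Y}) to the loss.
```

On the implementation side, the abstract only states that the negative conditional source entropy is added to the training objective. The sketch below shows one way such a term could be plugged into a standard rate-distortion loss; the toy model, the Gaussian cross-entropy proxy for $H(X\mid\hat{Y})$, and the weights `lam`/`gamma` are assumptions for illustration, not the paper's actual components.

```python
# Hypothetical sketch: a rate-distortion loss augmented with a negative
# conditional-source-entropy term, as described in the abstract. The toy
# model and the Gaussian proxy for H(X | y_hat) are illustrative assumptions.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyCompressionModel(nn.Module):
    """Minimal stand-in: analysis transform, additive-noise 'quantization',
    a factorized Gaussian rate proxy, and a synthesis transform."""

    def __init__(self, channels: int = 3, latent: int = 32):
        super().__init__()
        self.encode = nn.Conv2d(channels, latent, 5, stride=2, padding=2)
        self.decode = nn.ConvTranspose2d(latent, channels, 5, stride=2,
                                         padding=2, output_padding=1)
        self.log_scale = nn.Parameter(torch.zeros(latent))  # rate-proxy scale

    def forward(self, x):
        y = self.encode(x)
        # uniform noise as a differentiable stand-in for quantization
        y_hat = y + torch.empty_like(y).uniform_(-0.5, 0.5)
        x_hat = self.decode(y_hat)
        # negative log-likelihood of y_hat under a factorized Gaussian (rough rate proxy)
        scale = self.log_scale.exp().view(1, -1, 1, 1)
        nll = (0.5 * (y_hat / scale) ** 2
               + self.log_scale.view(1, -1, 1, 1)
               + 0.5 * math.log(2 * math.pi))
        rate_bits = nll.sum(dim=(1, 2, 3)) / math.log(2)
        return x_hat, rate_bits


def conditional_entropy_proxy(x, x_hat, sigma: float = 0.1):
    """Gaussian cross-entropy proxy for H(X | y_hat) in nats per image.
    This estimator is an assumption, not the paper's method."""
    dims = x.numel() / x.shape[0]
    mse = F.mse_loss(x_hat, x, reduction="none").flatten(1).sum(dim=1)
    return mse / (2 * sigma ** 2) + 0.5 * dims * math.log(2 * math.pi * sigma ** 2)


model = ToyCompressionModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
lam, gamma = 0.01, 1e-4  # distortion and regularizer weights (assumed)

x = torch.rand(4, 3, 64, 64)  # toy batch of images in [0, 1]
opt.zero_grad()
x_hat, rate_bits = model(x)
distortion = F.mse_loss(x_hat, x)
cond_entropy = conditional_entropy_proxy(x, x_hat).mean()

# Rate-distortion loss plus the negative conditional source entropy term,
# following the abstract's description of the regularizer.
loss = rate_bits.mean() + lam * distortion - gamma * cond_entropy
loss.backward()
opt.step()
```

Because the extra term appears only in the training loss, nothing changes at inference time, which is consistent with the abstract's claim that the regularizer imposes no inference overhead.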