An end‐to‐end joint learning scheme of image compression and quality enhancement with improved entropy minimization



Bibliographic Details
Published in: ETRI Journal, 2024, 46(6), pp. 935-949
Authors: Lee, Jooyoung; Cho, Seunghyun; Kim, Munchurl
Format: Article
Language: English
Online access: Full text
Description
Abstract: Recently, learned image compression methods based on entropy minimization have achieved superior results compared with conventional image codecs such as BPG and JPEG2000. However, they leverage single Gaussian models, which have a limited ability to approximate the various irregular distributions of transformed latent representations, resulting in suboptimal coding efficiency. Furthermore, existing methods focus on constructing effective entropy models rather than utilizing modern architectural techniques. In this paper, we propose a novel joint learning scheme, called JointIQ-Net, that incorporates image compression and quality enhancement technologies with improved entropy minimization based on a newly adopted Gaussian mixture model. We also exploit global context to estimate the distributions of latent representations precisely. The results of extensive experiments demonstrate that JointIQ-Net achieves remarkable performance improvements in terms of coding efficiency compared with existing learned image compression methods and conventional codecs. To the best of our knowledge, ours is the first learned image compression method that outperforms VVC intra-coding in terms of both PSNR and MS-SSIM.

The recent development of artificial neural networks has led to significant advancements in learned image compression (LIC) methods. However, current LIC approaches use only single Gaussian distributions as entropy models, limiting coding efficiency, and focus solely on image compression, overlooking modern quality enhancement architectures. Researchers have now developed a new approach, called JointIQ-Net, that combines image compression and quality enhancement. Moreover, it utilizes a novel Gaussian mixture model for entropy minimization, enhancing coding efficiency.
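For readers unfamiliar with Gaussian mixture entropy models, the sketch below illustrates the general idea the abstract refers to: each quantized latent element is modeled with a K-component Gaussian mixture, and its rate is estimated from the probability mass of the unit quantization bin. This is a minimal, self-contained illustration in PyTorch; the function name, tensor shapes, and the choice of K = 3 are assumptions for the example, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): per-element rate estimate for
# quantized latents y under a K-component Gaussian mixture entropy model.
import torch
from torch.distributions import Normal

def gmm_bits(y, weights, means, scales, eps=1e-9):
    """
    y:       quantized latents, shape (N,)
    weights: mixture weights (softmax-normalized), shape (N, K)
    means:   per-component means, shape (N, K)
    scales:  per-component standard deviations (positive), shape (N, K)
    Returns the estimated bits for each latent element, computed from the
    probability mass of the unit quantization bin [y - 0.5, y + 0.5].
    """
    y = y.unsqueeze(-1)                                 # (N, 1), broadcast over K
    comp = Normal(means, scales)
    # Probability mass each Gaussian component assigns to the bin around y.
    p_comp = comp.cdf(y + 0.5) - comp.cdf(y - 0.5)      # (N, K)
    p = (weights * p_comp).sum(dim=-1).clamp_min(eps)   # (N,)
    return -torch.log2(p)                               # bits per element

# Toy usage with K = 3 mixture components per latent element.
N, K = 4, 3
y = torch.round(torch.randn(N) * 3)
weights = torch.softmax(torch.randn(N, K), dim=-1)
means = torch.randn(N, K)
scales = torch.nn.functional.softplus(torch.randn(N, K)) + 1e-6
print(gmm_bits(y, weights, means, scales))
```

In an actual LIC model the mixture parameters would be predicted per element by hyperprior and context networks, and the summed bit estimate would serve as the rate term of the rate-distortion training loss; a single-Gaussian entropy model corresponds to the special case K = 1.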
ISSN:1225-6463
2233-7326
DOI:10.4218/etrij.2023-0275