On Rate Distortion via Constrained Optimization of Estimated Mutual Information

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 137970-137987
Authors: Tsur, Dor; Huleihel, Bashar; Permuter, Haim H.
Format: Article
Language: English
Description
Abstract: We propose a new methodology for the estimation of the rate distortion function (RDF), considering both continuous and discrete reconstruction spaces. The approach is input-space agnostic and requires no prior knowledge of either the source distribution or the distortion function. Our method is therefore a general solution to the RDF estimation problem, whereas existing works focus on a specific domain. The approach leverages neural estimation and constrained optimization of mutual information to optimize a generative model of the input distribution. In continuous spaces we learn a sample-generating model, while a probability mass function model is proposed for discrete spaces. Formal guarantees of the proposed method are explored and implementation details are discussed. We demonstrate our method's superior performance on both high-dimensional and large-alphabet synthetic data. In contrast to existing works, our estimator readily adapts to the rate distortion perception framework, which is central to contemporary compression tasks. Consequently, our method strengthens the connection between information theory and machine learning, offering new solutions to the problem of lossy compression.
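
For reference, the rate distortion function named above is the standard quantity
R(D) = \min_{P_{\hat{X}|X} : \mathbb{E}[d(X,\hat{X})] \le D} I(X; \hat{X}),
the least mutual information achievable between the source X and a reconstruction \hat{X} by any reconstruction channel whose expected distortion stays within the budget D. The rate distortion perception variant mentioned at the end of the abstract additionally constrains a divergence between the source and reconstruction distributions.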
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3462853
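
The record does not include the estimator itself, so the following is only a minimal sketch of the generic recipe the abstract alludes to: a neural lower bound on mutual information (here the Donsker-Varadhan bound used by MINE-style estimators) is minimized together with a Lagrangian distortion penalty over a learned stochastic reconstruction model. The toy Gaussian source, network sizes, squared-error distortion, and Lagrange multiplier lam are assumptions for illustration; this is not the authors' implementation, and it omits the discrete (probability mass function) case entirely.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

dim, lam, batch = 4, 5.0, 512  # toy source dimension, Lagrange multiplier, batch size (assumed)

critic = nn.Sequential(nn.Linear(2 * dim, 128), nn.ReLU(), nn.Linear(128, 1))     # T(x, y)
decoder = nn.Sequential(nn.Linear(2 * dim, 128), nn.ReLU(), nn.Linear(128, dim))  # y = g(x, noise)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(decoder.parameters(), lr=1e-3)

def dv_bound(x, y):
    # Donsker-Varadhan lower bound on I(X; Y): E_P[T] - log E_{P_X x P_Y}[exp T] (in nats).
    joint = critic(torch.cat([x, y], dim=1)).mean()
    y_shuffled = y[torch.randperm(y.shape[0])]  # shuffled pairs mimic the product of marginals
    scores = critic(torch.cat([x, y_shuffled], dim=1)).squeeze(1)
    return joint - (torch.logsumexp(scores, dim=0) - math.log(y.shape[0]))

def reconstruct(x):
    # Stochastic reconstruction channel P_{Y|X} parameterized by the decoder network.
    return decoder(torch.cat([x, torch.randn_like(x)], dim=1))

for step in range(3000):
    x = torch.randn(batch, dim)  # i.i.d. standard Gaussian source samples

    # (i) tighten the mutual information bound with respect to the critic
    y = reconstruct(x).detach()
    opt_c.zero_grad()
    (-dv_bound(x, y)).backward()
    opt_c.step()

    # (ii) minimize estimated rate + lam * distortion with respect to the reconstruction model
    y = reconstruct(x)
    opt_d.zero_grad()
    (dv_bound(x, y) + lam * F.mse_loss(y, x)).backward()
    opt_d.step()

with torch.no_grad():
    x = torch.randn(4096, dim)
    y = reconstruct(x)
    print(f"rate estimate ~ {dv_bound(x, y).item():.3f} nats, distortion ~ {F.mse_loss(y, x).item():.3f}")

Sweeping lam over a range of values yields a set of estimated (distortion, rate) pairs that trace out an approximation of the curve; note that minimizing a lower bound on mutual information in this way is a common heuristic rather than a guarantee.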