Worst sources and robust codes for difference distortion measures
| Published in: | IEEE Transactions on Information Theory, May 1975, Vol. 21 (3), pp. 301-309 |
|---|---|
| Author: | |
| Format: | Article |
| Language: | English |
| Online access: | Order full text |
Abstract: It has long been known that for a mean-square error distortion measure the Gaussian distribution requires the largest rate of all sources of a given variance. It has also been stated that a code designed for the Gaussian source and yielding distortion $d$ when used with a Gaussian source will yield distortion $\leq d$ when used with any independent-letter source of the same variance. In this paper, we extend these results in two directions: a) instead of assuming that the source has a fixed variance, we fix an arbitrary moment; b) instead of mean-square error distortion measures, we consider nearly arbitrary continuous difference distortion measures. For each moment constraint, we show that there is a distribution that has the largest rate for (nearly) any difference distortion measure and that a code designed for this source yielding distortion $d$ yields distortion $\leq d$ for any ergodic source satisfying the same moment constraint. Furthermore, digital encoding of the output of this encoder may yield a lower rate when this encoder is used with a source for which it was not designed. We also extend these results to the case of a random process or random field of known correlation function under a difference distortion measure.
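
For context, the classical mean-square-error special case that the abstract generalizes can be written out explicitly. The sketch below restates the standard Shannon rate-distortion bound for a memoryless source of variance $\sigma^2$; the symbols $R_X$, $R_G$, and $\sigma^2$ are introduced here for illustration and are not part of the catalog record.

```latex
% Classical MSE special case the abstract builds on (standard result; notation added here).
% For any memoryless source X with variance \sigma^2 under squared-error distortion,
% its rate-distortion function is dominated by that of the Gaussian source of the same variance:
\[
  R_X(d) \;\le\; R_G(d) \;=\; \tfrac{1}{2}\,\log\!\frac{\sigma^2}{d},
  \qquad 0 < d \le \sigma^2,
\]
% with equality when X is Gaussian.  The robustness statement in the abstract is the companion
% claim: a code designed for the Gaussian source that attains distortion d on it attains
% distortion \le d on any independent-letter source with the same variance.
```

The paper's contribution, per the abstract, is to replace the variance constraint by an arbitrary moment constraint and squared error by a general continuous difference distortion measure, exhibiting an analogous worst-case distribution and robust code in each case.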
ISSN: 0018-9448, 1557-9654

DOI: 10.1109/TIT.1975.1055375