Compression of hyperspectral imagery


Bibliographic Details
Main Authors: Motta, G., Rizzo, F., Storer, J.A.
Format: Conference Paper
Language: English
Description
Abstract: High dimensional source vectors, such as those that occur in hyperspectral imagery, are partitioned into a number of subvectors of different length, and each subvector is then vector quantized (VQ) individually with an appropriate codebook. A locally adaptive partitioning algorithm is introduced that performs comparably in this application to a more expensive globally optimal one that employs dynamic programming. The VQ indices are entropy coded and used to condition the lossless or near-lossless coding of the residual error. Motivated by the need to maintain uniform quality across all vector components, a percentage maximum absolute error distortion measure is employed. Experiments on the lossless and near-lossless compression of NASA AVIRIS images are presented. A key advantage of the approach is the use of independent small VQ codebooks that allow fast encoding and decoding.
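The following Python sketch is only an illustration of the general idea described in the abstract: a spectral vector is split into subvectors, each subvector is quantized with its own small codebook, and the residual that a coder of this kind would entropy code is computed. The partition boundaries, codebook sizes, codebook contents, and the squared-error nearest-neighbor rule are hypothetical placeholders, not the scheme of Motta, Rizzo, and Storer (who use a percentage maximum absolute error criterion and a locally adaptive partitioning algorithm).

import numpy as np

def quantize_partitioned(vector, boundaries, codebooks):
    """Quantize each subvector of `vector` with its own codebook.

    vector     : 1-D array of spectral samples for one pixel.
    boundaries : list of (start, end) index pairs defining the subvectors.
    codebooks  : list of 2-D arrays; codebooks[i] has shape (K_i, end-start).
    Returns the chosen codeword indices and the residual vector.
    """
    indices = []
    reconstruction = np.empty_like(vector, dtype=float)
    for (start, end), codebook in zip(boundaries, codebooks):
        sub = vector[start:end].astype(float)
        # Nearest codeword under squared error (a common VQ choice; the paper
        # instead employs a percentage maximum absolute error measure).
        dists = np.sum((codebook - sub) ** 2, axis=1)
        best = int(np.argmin(dists))
        indices.append(best)
        reconstruction[start:end] = codebook[best]
    residual = vector.astype(float) - reconstruction
    return indices, residual

# Toy usage with made-up data: a 224-band AVIRIS-like vector split into three
# subvectors of different lengths, each with a small independent codebook.
rng = np.random.default_rng(0)
pixel = rng.integers(0, 4096, size=224)            # 12-bit-like samples
boundaries = [(0, 60), (60, 150), (150, 224)]      # hypothetical partition
codebooks = [rng.integers(0, 4096, size=(16, e - s)).astype(float)
             for s, e in boundaries]
idx, res = quantize_partitioned(pixel, boundaries, codebooks)
print(idx, float(np.max(np.abs(res))))

In an actual coder of this type, the indices would be entropy coded and used to condition the lossless or near-lossless coding of the residual; the small per-subvector codebooks keep both encoding and decoding fast.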
ISSN: 1068-0314, 2375-0359
DOI: 10.1109/DCC.2003.1194024