An efficient vector quantizer providing globally optimal solutions



Bibliographic Details
Published in: IEEE Transactions on Signal Processing, 1998-09, Vol. 46 (9), pp. 2515-2529
Main authors: Moller, U., Galicki, M., Baresova, E., Witte, H.
Format: Article
Language: English
Abstract: This paper presents a new approach to vector quantization designed for clustering or source coding. It combines the fast convergence of a monotonically descending algorithm with the globally optimal solution provided by a random optimization technique, and thus benefits from the properties of both deterministic and stochastic search. Comprehensive experiments demonstrate that the new algorithm indeed combines the advantages of both components. It may therefore be regarded as an accelerated global optimization method whose convergence is proved theoretically. Depending on the complexity of the quantization problem, the convergence rate is shown (numerically) to approach that of a coordinate descent algorithm that iteratively updates a single codevector at a time (the generalized Lloyd algorithm, GLA, i.e., K-means). The new method is investigated and compared with the GLA and a globally operating stochastic relaxation technique. The comparison was made with respect to quality, reliability, and efficiency, and applied to four categories of data: an easy-to-grasp example, patterns derived from the EEG, Gauss-Markov sources, and image sources.
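
For orientation, the coordinate descent component named in the abstract is the generalized Lloyd algorithm (GLA, i.e., K-means): samples are assigned to their nearest codevectors, and each codevector is then recomputed as the centroid of its cell. The Python sketch below illustrates only this GLA update plus an optional random perturbation of the codebook as a stand-in for the stochastic component; the function name, parameters, and perturbation step are illustrative assumptions and not the paper's actual hybrid method.

    # Minimal GLA / K-means sketch (assumptions noted above, not the paper's algorithm).
    import numpy as np

    def gla(data, num_codevectors, iterations=50, perturb_scale=0.0, rng=None):
        """Alternate nearest-neighbour assignment and centroid update of the codebook."""
        rng = np.random.default_rng() if rng is None else rng
        # Initialize the codebook with randomly chosen training vectors.
        codebook = data[rng.choice(len(data), num_codevectors, replace=False)].copy()
        for _ in range(iterations):
            # Assignment step: minimum squared Euclidean distortion.
            dists = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each codevector becomes the mean of its cell.
            for k in range(num_codevectors):
                members = data[labels == k]
                if len(members) > 0:
                    codebook[k] = members.mean(axis=0)
            # Illustrative stochastic step (assumption): small random perturbation
            # of the codebook to help escape poor local minima.
            if perturb_scale > 0:
                codebook += rng.normal(scale=perturb_scale, size=codebook.shape)
        # Final distortion of the returned codebook.
        dists = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        return codebook, dists.argmin(axis=1), dists.min(axis=1).mean()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        samples = rng.normal(size=(500, 2))
        cb, labels, distortion = gla(samples, num_codevectors=8, rng=rng)
        print("mean distortion:", distortion)
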
ISSN: 1053-587X, 1941-0476
DOI: 10.1109/78.709539