Wavelet transforms and neural networks for compression and recognition
Saved in:
Published in: Neural Networks, 1996, Vol. 9 (4), p. 695-708
Main authors: , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Robust recognition for image and speech processing needs data compression that preserves features. To accomplish this, we have utilized the discrete wavelet transform and the continuous wavelet transform (CWT) together with artificial neural networks (ANN) to achieve automatic pattern recognition. Our approach is motivated by the mathematical analogy of the CWT to the human hearing and visual systems, e.g., the so-called Mexican hat and Gabor (Gaussian-window) functions, respectively. We develop an ANN method to construct an optimum mother wavelet that can organize sensor input data in the multiresolution format that seems to be essential for brain-style computing. In one realization, the architecture of our ANN is similar to that of a radial basis function approach, except that each node is a wavelet having three learnable parameters: weight W_ij, scale a, and shift b. The node is not a McCulloch-Pitts neuron but a "wave-on". We still use a supervised conjugate gradient descent learning algorithm over these parameters to construct a "super-mother" wavelet from a superposition of a set of wave-ons (mother wavelets). Using these techniques, we can accomplish signal-enhanced and feature-preserving compression, e.g., on infrared images, that avoids the overtraining and overfitting that have plagued the ability of ANNs to generalize and abstract information.
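The wavelet-network node the abstract describes (an RBF-like unit whose basis function is a wavelet with a learnable weight, scale, and shift) can be sketched in a few lines. This is an illustrative reconstruction, not the paper's code: the Mexican-hat mother wavelet, plain gradient descent in place of the paper's conjugate gradient method, and the `WaveletNet` class name are all assumptions made here for a minimal, runnable example.

```python
import numpy as np

def mexican_hat(t):
    # Mexican-hat (Ricker) mother wavelet: proportional to the
    # second derivative of a Gaussian.
    return (1.0 - t**2) * np.exp(-0.5 * t**2)

def mexican_hat_deriv(t):
    # d/dt of the Mexican hat, needed for the scale/shift gradients.
    return (t**3 - 3.0 * t) * np.exp(-0.5 * t**2)

class WaveletNet:
    """One-layer wavelet network: y(x) = sum_j w_j * psi((x - b_j) / a_j).

    Each node (a "wave-on") carries the three learnable parameters named
    in the abstract: a weight w_j, a scale a_j, and a shift b_j.
    """
    def __init__(self, n_nodes, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, n_nodes)    # output weights
        self.a = np.ones(n_nodes)                 # scales (dilations)
        self.b = np.linspace(-3.0, 3.0, n_nodes)  # shifts (translations)

    def forward(self, x):
        t = (x[:, None] - self.b) / self.a        # (N, n_nodes)
        return mexican_hat(t) @ self.w            # (N,)

    def fit(self, x, y, lr=0.02, epochs=3000):
        # Supervised learning by gradient descent on the mean squared
        # error (the paper uses conjugate gradients; the gradients with
        # respect to w, a, b are the same either way).
        n = len(x)
        for _ in range(epochs):
            t = (x[:, None] - self.b) / self.a
            psi = mexican_hat(t)
            err = psi @ self.w - y                # residual per sample
            dpsi = mexican_hat_deriv(t)
            grad_w = (2.0 / n) * (psi.T @ err)
            grad_b = -(2.0 / n) * self.w / self.a * (dpsi.T @ err)
            grad_a = -(2.0 / n) * self.w / self.a * ((dpsi * t).T @ err)
            self.w -= lr * grad_w
            self.b -= lr * grad_b
            # keep scales positive and bounded away from zero
            self.a = np.maximum(self.a - lr * grad_a, 0.1)
        return float(np.mean(err**2))

# Fit a Gaussian bump from a superposition of wave-ons.
x = np.linspace(-3.0, 3.0, 64)
y = np.exp(-x**2)
net = WaveletNet(n_nodes=8)
loss_before = float(np.mean((net.forward(x) - y) ** 2))
loss_after = net.fit(x, y)
```

The "super-mother" wavelet of the abstract then corresponds to the learned superposition `sum_j w_j * psi((x - b_j) / a_j)`; fitting a smooth bump, the training loss drops well below its initial value.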
ISSN: 0893-6080, 1879-2782
DOI: 10.1016/0893-6080(95)00051-8