Sensing data discrete wavelet fusion for pattern recognition with qualitative and quantitative measuring


Saved in:
Bibliographic Details
Main Authors: Zhengmao Ye, Mohamadian, H., Yongmao Ye
Format: Conference Proceedings
Language: English
Subjects:
Online Access: Order full text
Description
Summary: Sensing data fusion has many real-world applications in fields such as weather forecasting, environmental surveillance, medical diagnosis, information assurance, space exploration, and national security. Image fusion is a primary approach to data fusion. Among similar images, unique patterns occur within each individual image. Typical image fusion techniques are either area based or feature based. The feature-based approach is efficient and robust for multi-sensor image fusion when there is little rotation or translation, or when the images have been aligned beforehand. The area-based approach imposes no strict requirement on rotation or translation but lacks robustness. A combination of the two approaches is therefore required. In this article, wavelet fusion is presented to analyze the effect of image fusion. In addition to qualitative measures, quantitative measures are proposed to evaluate image fusion. In particular, the 2D discrete wavelet transform is used both to decompose images and to reconstruct images from the approximation, horizontal detail, vertical detail, and diagonal detail components of the inputs. At the same time, quantitative measures are used to evaluate the quality of the 2D wavelet transform and wavelet fusion, where gray level energy, discrete entropy, relative entropy, and mutual information are applied.
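The pipeline described in the summary (single-level 2D DWT decomposition into approximation, horizontal, vertical, and diagonal detail subbands, subband merging, reconstruction, and entropy-based evaluation) can be sketched roughly as follows, assuming PyWavelets and NumPy. The wavelet family ('db2'), the average/maximum-magnitude fusion rule, and the 8-bit gray-level histograms are illustrative assumptions, not necessarily the authors' exact choices.

import numpy as np
import pywt

def dwt_fuse(img_a, img_b, wavelet="db2"):
    """Fuse two registered grayscale images via a single-level 2D DWT."""
    # Decompose each input into approximation (cA) and detail (cH, cV, cD) subbands.
    cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a, wavelet)
    cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b, wavelet)
    # Assumed fusion rule: average the approximations, keep the larger-magnitude
    # detail coefficients so salient edges from either image survive.
    cA = 0.5 * (cA_a + cA_b)
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    details = (pick(cH_a, cH_b), pick(cV_a, cV_b), pick(cD_a, cD_b))
    # Reconstruct the fused image from the merged subbands.
    return pywt.idwt2((cA, details), wavelet)

def _gray_hist(img, bins=256):
    # Normalized gray-level histogram (assumes an 8-bit intensity range).
    hist, _ = np.histogram(img.ravel(), bins=bins, range=(0, 255))
    return hist.astype(float) / hist.sum()

def gray_level_energy(img):
    """E = sum_k p(k)^2 over the gray-level distribution."""
    p = _gray_hist(img)
    return float(np.sum(p ** 2))

def discrete_entropy(img):
    """H = -sum_k p(k) log2 p(k) over the gray-level distribution."""
    p = _gray_hist(img)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def relative_entropy(img_p, img_q):
    """Kullback-Leibler divergence D(p || q) between gray-level distributions."""
    p, q = _gray_hist(img_p), _gray_hist(img_q)
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def mutual_information(img_a, img_b, bins=256):
    """I(A;B) from the joint gray-level histogram of two images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(),
                                 bins=bins, range=[[0, 255], [0, 255]])
    pab = joint / joint.sum()
    pa = pab.sum(axis=1, keepdims=True)
    pb = pab.sum(axis=0, keepdims=True)
    mask = pab > 0
    return float(np.sum(pab[mask] * np.log2(pab[mask] / (pa @ pb)[mask])))

A typical quantitative evaluation along these lines would compare discrete_entropy and gray_level_energy of the fused result against the sources, and use mutual_information(img_a, fused) + mutual_information(img_b, fused) as a score of how much source information the fusion preserves.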
ISSN:2161-4393
1522-4899
2161-4407
DOI:10.1109/IJCNN.2008.4634320