Spatial Stimuli Gradient Based Multifocus Image Fusion Using Multiple Sized Kernels

Bibliographic Details
Published in: Tehnički vjesnik 2021, Vol. 28 (1), p. 113-122
Main authors: Muzammil, Muhammad; Ali, Imdad; Javed, Umer; Amir, Muhammad; Ulhaq, Ihsan
Format: Article
Language: English
Online access: Full text
Summary: Multi-focus image fusion extracts the focused regions from all source images and combines them into a single image that contains all in-focus objects. This paper proposes a spatial-domain fusion scheme for multi-focus images using multiple-sized kernels. First, the source images are pre-processed with a contrast-enhancement step; soft and hard decision maps are then generated by applying a sliding-window technique with multiple-sized kernels to the gradient images. The hard decision map selects accurate focus information from the source images, whereas the soft decision map selects the basic focus information and contains a minimum of falsely detected focused/unfocused regions. These decision maps are further processed to compute the final focus map. The gradient images are constructed with a state-of-the-art edge-detection technique, the spatial stimuli gradient sketch model, which computes the local stimuli from perceived brightness and thereby enhances the essential structural and edge information. Detailed experimental results demonstrate that the proposed multi-focus image fusion algorithm outperforms other well-known state-of-the-art multi-focus fusion methods in terms of both subjective visual perception and objective quality-evaluation metrics.
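
The pipeline described in the summary lends itself to a compact illustration. The Python sketch below mimics the multi-kernel decision-map idea for two source images: a gradient magnitude is computed per image, local focus activity is measured over several window sizes, unanimous votes form a hard map, a majority vote forms a soft map, and the combined focus map blends the sources. This is a minimal sketch, not the authors' method: the Sobel magnitude stands in for the spatial stimuli gradient sketch model, the kernel sizes (3, 7, 11), the unanimity/majority rules, and the median-filter clean-up are illustrative assumptions, and the contrast-enhancement pre-processing step is omitted.

# Minimal sketch of multi-kernel soft/hard decision-map fusion for two
# grayscale images in [0, 1]. NOT the paper's exact method: the Sobel
# magnitude replaces the spatial stimuli gradient sketch model, and the
# kernel sizes and map-combination rules are illustrative assumptions.
import numpy as np
from scipy import ndimage


def gradient_magnitude(img):
    # Stand-in gradient operator (the paper uses the spatial stimuli
    # gradient sketch model instead).
    gx = ndimage.sobel(img, axis=1, mode="reflect")
    gy = ndimage.sobel(img, axis=0, mode="reflect")
    return np.hypot(gx, gy)


def fuse(img_a, img_b, kernel_sizes=(3, 7, 11)):
    ga, gb = gradient_magnitude(img_a), gradient_magnitude(img_b)
    # For each kernel size, vote per pixel on which image is sharper,
    # using the mean gradient magnitude in a k-by-k sliding window.
    votes = np.stack([
        ndimage.uniform_filter(ga, size=k) > ndimage.uniform_filter(gb, size=k)
        for k in kernel_sizes
    ])
    hard_a = votes.all(axis=0)         # every kernel size prefers img_a
    hard_b = (~votes).all(axis=0)      # every kernel size prefers img_b
    soft_a = votes.mean(axis=0) > 0.5  # majority of kernel sizes prefers img_a
    # Final focus map: trust unanimous (hard) decisions, and fall back on
    # the majority (soft) map where the kernel sizes disagree.
    focus = np.where(hard_a, 1.0, np.where(hard_b, 0.0, soft_a.astype(float)))
    # Median filtering suppresses isolated misclassified pixels.
    focus = ndimage.median_filter(focus, size=5)
    return focus * img_a + (1.0 - focus) * img_b


if __name__ == "__main__":
    # Toy usage: one scene, each source defocused in a complementary half.
    rng = np.random.default_rng(0)
    scene = ndimage.uniform_filter(rng.random((128, 128)), size=3)
    blurred = ndimage.gaussian_filter(scene, sigma=3)
    left = scene.copy()
    left[:, 64:] = blurred[:, 64:]    # right half defocused
    right = scene.copy()
    right[:, :64] = blurred[:, :64]   # left half defocused
    fused = fuse(left, right)
    print("fused shape:", fused.shape)

Trusting the hard map where all window sizes agree and the soft map elsewhere mirrors the division of labour the summary assigns to the two maps, though the paper's actual combination rule may differ.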
ISSN: 1330-3651 (print); 1848-6339 (online)
DOI: 10.17559/TV-20191030075647