Image Segmentation-Based Multi-Focus Image Fusion Through Multi-Scale Convolutional Neural Network


Bibliographic Details
Published in: IEEE Access, 2017, Vol. 5, pp. 15750-15761
Authors: Du, Chaoben; Gao, Shesheng
Format: Article
Language: English
Online access: Full text
Description
Abstract: A decision map contains complete and clear information about the images to be fused, and detecting it is crucial to many image fusion problems, especially multi-focus image fusion. Nevertheless, obtaining a reliable decision map is necessary for a satisfactory fusion result and is always difficult. In this paper, we address this problem with a novel image segmentation-based multi-focus image fusion algorithm, in which detecting the decision map is treated as segmenting the source images into focused and defocused regions. The proposed method performs this segmentation with a multi-scale convolutional neural network, which carries out a multi-scale analysis on each input image to derive feature maps along the boundaries between focused and defocused regions. The feature maps are then combined to produce a fused feature map. Afterward, the fused map is post-processed with initial segmentation, morphological operations, and a watershed transform to obtain the segmentation map, which serves as the decision map. We show that the decision map obtained from the multi-scale convolutional neural network is reliable and leads to high-quality fusion results. Experimental results validate that the proposed algorithm achieves superior fusion performance under both qualitative and quantitative evaluations.
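
To make the decision-map-based fusion step concrete, the minimal Python sketch below fuses two pre-registered grayscale source images through a binary decision map that is cleaned up with morphological operations. It is only an illustration under simplifying assumptions: a local Laplacian-energy measure stands in for the multi-scale CNN's fused feature map, the initial segmentation and watershed refinement are omitted, and all file names, function names, and kernel sizes (rough_focus_map, fuse_with_decision_map, ksize) are hypothetical choices, not the authors' implementation.

# Minimal sketch of decision-map-based multi-focus fusion (not the authors' code).
# Assumes two pre-registered grayscale source images; a simple Laplacian-energy
# focus measure stands in for the multi-scale CNN output described in the paper.
import cv2
import numpy as np

def rough_focus_map(img_a, img_b, ksize=7):
    """Return a binary map: 1 where img_a appears more in focus than img_b."""
    # Local Laplacian energy as a stand-in focus measure.
    energy_a = cv2.blur(np.abs(cv2.Laplacian(img_a, cv2.CV_64F)), (ksize, ksize))
    energy_b = cv2.blur(np.abs(cv2.Laplacian(img_b, cv2.CV_64F)), (ksize, ksize))
    return (energy_a > energy_b).astype(np.uint8)

def fuse_with_decision_map(img_a, img_b, decision, ksize=9):
    """Fuse two source images pixel-wise according to a smoothed decision map."""
    # Morphological opening and closing remove small misclassified regions,
    # a simplified stand-in for the post-processing described in the abstract.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (ksize, ksize))
    decision = cv2.morphologyEx(decision, cv2.MORPH_OPEN, kernel)
    decision = cv2.morphologyEx(decision, cv2.MORPH_CLOSE, kernel)
    return np.where(decision.astype(bool), img_a, img_b)

if __name__ == "__main__":
    # 'near_focus.png' and 'far_focus.png' are placeholder file names.
    a = cv2.imread("near_focus.png", cv2.IMREAD_GRAYSCALE)
    b = cv2.imread("far_focus.png", cv2.IMREAD_GRAYSCALE)
    fused = fuse_with_decision_map(a, b, rough_focus_map(a, b))
    cv2.imwrite("fused.png", fused)

In the paper itself, the decision map comes from the network's fused feature map refined by initial segmentation, morphology, and watershed; the sketch above only mimics the final pixel-wise selection between focused and defocused regions.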
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2017.2735019