Fully convolutional neural network with attention gate and fuzzy active contour model for skin lesion segmentation

Full description

Bibliographic details
Published in: Multimedia Tools and Applications, 2022-04, Vol. 81 (10), pp. 13979-13999
Main authors: Tran, Thi-Thao; Pham, Van-Truong
Format: Article
Language: English
Online access: Full text
Description
Abstract: This study proposes an approach for segmenting skin lesions in dermoscopic images based on a fully convolutional neural network (FCN) and an active contour model (ACM). The FCN architecture is adapted from the SegNet neural network. In particular, the paper proposes to use skip connections and to integrate additive attention gates (AGs) into the SegNet architecture, so that the model can better handle variation in the shapes and sizes of target objects and produce more accurate segmentations. In addition, a fuzzy energy-based shape distance is introduced into the loss function to minimize the dissimilarity between the predicted and reference masks. Moreover, a fuzzy energy-based ACM, with contours initialized from the network's predicted masks, is employed to further evolve the contour toward the desired object boundary. The proposed model can therefore exploit the advantages of both the neural network and the fuzzy ACM to build a fully automatic and robust approach for skin lesion segmentation. The approach is evaluated on the ISIC 2017 and PH2 challenge databases; comparative results on both databases show strong performance compared with other state-of-the-art methods.
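To make the two main ingredients of the abstract concrete, the sketch below illustrates (1) an additive attention gate of the standard form (ReLU of summed linear projections followed by a sigmoid, applied to a skip connection), and (2) one plausible form of a fuzzy shape-distance loss, here taken as the mean squared difference between two soft membership maps in [0, 1]. This is a minimal numpy illustration, not the paper's implementation: the exact projection shapes, the function names, and the specific shape-distance formula are assumptions for illustration only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def additive_attention_gate(x, g, W_x, W_g, psi):
    """Additive attention gate applied to a skip connection (illustrative).

    x        : (C, N) skip-connection features (N = H*W pixels, flattened)
    g        : (C, N) gating features from the coarser decoder level
    W_x, W_g : (F, C) projection matrices (learned in a real network)
    psi      : (1, F) projection producing one attention coefficient per pixel
    Returns x scaled per pixel by attention coefficients alpha in (0, 1).
    """
    q = np.maximum(W_x @ x + W_g @ g, 0.0)   # ReLU(W_x x + W_g g), shape (F, N)
    alpha = sigmoid(psi @ q)                 # attention map, shape (1, N)
    return x * alpha                         # broadcast over the channel axis

def fuzzy_shape_distance(u, v):
    """Hypothetical fuzzy shape-distance loss: mean squared difference
    between two fuzzy membership maps (values clipped to [0, 1])."""
    u = np.clip(u, 0.0, 1.0)
    v = np.clip(v, 0.0, 1.0)
    return float(np.mean((u - v) ** 2))
```

Because alpha lies strictly in (0, 1), the gate can only attenuate skip features, suppressing irrelevant background regions; the loss term is 0 for identical masks and grows as the predicted and reference memberships diverge.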
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-022-12413-1