Level set based shape prior and deep learning for image segmentation


Bibliographic Details
Published in: IET Image Processing, 2020-01, Vol. 14 (1), pp. 183-191
Main Authors: Han, Yongming; Zhang, Shuheng; Geng, Zhiqing; Wei, Qin; Ouyang, Zhi
Format: Article
Language: English
Description
Abstract: Deep convolutional neural networks can effectively extract hidden patterns in images and learn realistic image priors from training data, and fully convolutional networks (FCNs) have achieved state-of-the-art performance in image segmentation. However, these methods suffer from noise, rough boundaries and the lack of a shape prior. Therefore, this study proposes a level set method with a deep prior for image segmentation, based on the priors learned by FCNs. The FCNs learn high-level semantic patterns from the training set; their output expresses this high-level semantic information as a probability map, and a global affine transformation yields the optimal affine alignment of the intrinsic prior shape. The improved level set method then integrates the information of the original image, the probability map and the corrected prior shape to achieve the segmentation. Compared with traditional level set methods, which are limited to simple scenes, the proposed method overcomes the shortcomings of FCNs by using high-level semantic information to segment images of complex scenes. Finally, the Portrait data set is used to verify the effectiveness of the proposed method. The experimental results show that the proposed method obtains more accurate segmentation results than traditional FCNs.
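The abstract describes a level set evolution driven by three sources of information: a region term from the original image, the FCN probability map, and an affine-corrected prior shape. The sketch below illustrates one plausible way such terms could be combined in a single update; the specific forces, weights and parameter names are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def evolve_level_set(phi, image, prob_map, phi_prior,
                     lam_img=1.0, lam_prob=1.0, lam_shape=0.5,
                     dt=0.1, n_iter=50):
    """Illustrative level set evolution combining three forces:
    a Chan-Vese-style region term on the image, a term from the
    FCN probability map, and a pull toward a prior shape.
    All weights and the update rule are hypothetical."""
    for _ in range(n_iter):
        inside = phi > 0
        # Region means for a Chan-Vese-style image data term.
        c_in = image[inside].mean() if inside.any() else 0.0
        c_out = image[~inside].mean() if (~inside).any() else 0.0
        img_force = (image - c_out) ** 2 - (image - c_in) ** 2
        # Probability-map term: push phi positive where the FCN
        # predicts foreground (p > 0.5), negative otherwise.
        prob_force = prob_map - 0.5
        # Shape-prior term: pull phi toward the (already affine-
        # corrected) prior level set function.
        shape_force = phi_prior - phi
        phi = phi + dt * (lam_img * img_force
                          + lam_prob * prob_force
                          + lam_shape * shape_force)
    return phi
```

In practice the shape prior would first be aligned to the image by the global affine transformation the abstract mentions; here `phi_prior` is assumed to be already corrected.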
ISSN: 1751-9659
1751-9667
DOI: 10.1049/iet-ipr.2018.6622