Attention-Guided Pyramid Context Network for Polyp Segmentation in Colonoscopy Images

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2023-01, Vol. 72, p. 1-1
Authors: Yue, Guanghui; Li, Siying; Cong, Runmin; Zhou, Tianwei; Lei, Baiying; Wang, Tianfu
Format: Article
Language: English
Keywords:
Online Access: Order full text
Description
Abstract: Recently, deep convolutional neural networks (CNNs) have provided an effective tool for automated polyp segmentation in colonoscopy images. However, most CNN-based methods do not fully consider the feature interaction among different layers and often cannot provide satisfactory segmentation performance. In this paper, a novel attention-guided pyramid context network (APCNet) is proposed for accurate and robust polyp segmentation in colonoscopy images. Specifically, considering that different network layers represent the polyp in different aspects, APCNet first extracts multi-layer features in a pyramid structure, then utilizes an attention-guided multi-layer aggregation strategy to refine the context features of each layer by exploiting the complementary information of different layers. To obtain abundant context features, APCNet employs a context extraction module that explores the context information of each layer via local information retainment and global information compaction. Through top-down deep supervision, APCNet implements coarse-to-fine polyp segmentation and finally localizes the polyp region precisely. Experiments on two in-domain and four out-of-domain datasets show that APCNet is comparable to 19 state-of-the-art methods. Moreover, it holds a more appropriate trade-off between effectiveness and computational complexity than these competing methods.
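The attention-guided multi-layer aggregation described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names, the use of nearest-neighbor upsampling, and the choice of channel-mean activations as per-layer attention logits are all assumptions made purely for illustration of the general idea — aligning pyramid features to a common resolution and fusing them with softmax attention weights computed across layers.

```python
# Illustrative sketch only (assumed design, NOT the APCNet implementation):
# aggregate multi-layer pyramid features with per-position attention weights
# computed across layers.
import numpy as np

def upsample_nearest(feat, out_h, out_w):
    """Nearest-neighbor upsampling of a (C, H, W) feature map."""
    c, h, w = feat.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return feat[:, rows][:, :, cols]

def attention_guided_aggregation(features):
    """Fuse pyramid features by softmax attention across layers.

    features: list of (C, H_i, W_i) arrays from different network layers,
              finest resolution first.
    Returns a (C, H, W) fused map at the finest resolution.
    """
    c, h, w = features[0].shape
    # Align all layers to the finest spatial resolution: (L, C, H, W).
    aligned = np.stack([upsample_nearest(f, h, w) for f in features])
    # One attention logit per layer and spatial position; here the channel
    # mean serves as a crude saliency proxy (an assumption for this sketch).
    logits = aligned.mean(axis=1)                      # (L, H, W)
    weights = np.exp(logits - logits.max(axis=0))
    weights = weights / weights.sum(axis=0)            # softmax over layers
    # Weighted sum over layers gives the fused feature map.
    return (weights[:, None] * aligned).sum(axis=0)    # (C, H, W)
```

In a real network the attention logits would be produced by learned layers rather than a channel mean, but the fusion pattern — upsample, weight per position, sum over layers — is the core of any such multi-layer aggregation.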
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2023.3244219