Polyp segmentation with distraction separation
Published in: Expert Systems with Applications, 2023-10, Vol. 228, p. 120434, Article 120434
Main authors: , , , , , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: In clinical practice, automatic polyp segmentation in colonoscopy images is important for computer-aided diagnosis of colorectal cancer. Existing polyp segmentation methods still struggle to distinguish polyps from normal tissue because of false positive/negative distractions. In this paper, we propose a novel Distraction Separation Network (DSNet) that mines potential polyp regions from low-level semantic features while segregating background regions. The framework is built on two modules: the neighbor fusion module (NFM) and the distraction separation module (DSM). The neighbor fusion module first integrates high-level features to obtain an initial segmentation result that serves as a prior guidance map. Guided by this prior, multiple distraction separation modules then capture multi-scale contextual information to eliminate distractions. By separating distractions at different levels, DSNet progressively refines the segmentation results. Extensive experiments show that DSNet outperforms state-of-the-art methods on six challenging benchmark datasets.
Highlights:
- We propose a novel automatic method called DSNet for polyp segmentation.
- Our DSNet can extract and remove both false positive and false negative distractions.
- We evaluate the effectiveness of the proposed method on six public datasets.
- The results show that the proposed method outperforms other state-of-the-art models.
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2023.120434
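The abstract's two-stage pipeline — fuse high-level features into a prior guidance map, then use that prior to separate polyp responses from distracting background at each feature level — can be sketched in broad strokes. Everything below (function names, the averaging-based fusion, the sigmoid gating, the array shapes) is an illustrative assumption for intuition only, not the authors' actual NFM/DSM implementation:

```python
import numpy as np

def upsample(x, size):
    """Nearest-neighbor upsampling of a 2-D feature map to `size` (H, W)."""
    h, w = x.shape
    H, W = size
    rows = np.arange(H) * h // H
    cols = np.arange(W) * w // W
    return x[np.ix_(rows, cols)]

def neighbor_fusion(high_feats, out_size):
    """Hypothetical NFM stand-in: average upsampled high-level maps and
    squash to [0, 1], yielding an initial segmentation prior."""
    fused = np.mean([upsample(f, out_size) for f in high_feats], axis=0)
    return 1.0 / (1.0 + np.exp(-fused))  # sigmoid -> prior guidance map

def distraction_separation(low_feat, prior):
    """Hypothetical DSM stand-in: gate a low-level feature map into a
    foreground (potential polyp) branch and a background (distraction)
    branch using the prior, then recombine with distractions suppressed."""
    fg = low_feat * prior          # responses inside the predicted polyp
    bg = low_feat * (1.0 - prior)  # responses from distracting background
    return fg - bg                 # refined map with distractions removed

# Toy run: two coarse high-level maps guide one finer low-level map.
rng = np.random.default_rng(0)
high = [rng.standard_normal((8, 8)) for _ in range(2)]
low = rng.standard_normal((32, 32))
prior = neighbor_fusion(high, low.shape)
refined = distraction_separation(low, prior)
print(prior.shape, refined.shape)  # (32, 32) (32, 32)
```

In the paper's actual design the DSM is applied at multiple levels so the prior is refined progressively; the sketch shows only a single refinement step.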