G-T correcting: an improved training of image segmentation under noisy labels
Published in: Medical & Biological Engineering & Computing, 2024-12, Vol. 62 (12), pp. 3781–3799
Main authors: , , ,
Format: Article
Language: English
Online access: Full text
Abstract: Data-driven medical image segmentation networks require expert annotations, which are hard to obtain. Non-expert annotations are often used instead, but these can be inaccurate (referred to as “noisy labels”), misleading the network’s training and degrading segmentation performance. In this study, we focus on improving the segmentation performance of neural networks trained with noisy annotations. Specifically, we propose a two-stage framework named “G-T correcting,” consisting of a “G” stage that recognizes noisy labels and a “T” stage that corrects them. In the “G” stage, a positive feedback method automatically recognizes noisy samples, using a Gaussian mixture model to separate clean from noisy labels based on the per-sample loss histogram. In the “T” stage, a confident correcting strategy and an early learning strategy allow the segmentation network to extract productive guidance from noisy labels. Experiments on simulated and real-world noisy labels show that the method recognizes noisy labels with over 90% accuracy and raises the network’s Dice coefficient to 91%. These results demonstrate that the proposed method enhances segmentation performance when training with noisy labels, indicating good prospects for clinical application.
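To make the “G” stage concrete, below is a minimal sketch of the loss-based noise recognition the abstract describes: a two-component Gaussian mixture model is fit to per-sample training losses, and the component with the lower mean loss is treated as “clean.” The function name `split_clean_noisy`, the probability threshold, and the synthetic losses are illustrative assumptions, not details from the paper (which additionally uses a positive feedback scheme not shown here).

```python
# Hedged sketch: two-component GMM over per-sample losses, where the
# low-loss component is taken to contain the cleanly labeled samples.
# Names and the 0.5 threshold are illustrative, not from the paper.
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(per_sample_losses, threshold=0.5):
    """Return a boolean mask that is True for samples judged clean.

    per_sample_losses: 1-D array of segmentation losses, one per sample.
    """
    losses = np.asarray(per_sample_losses).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    # The component with the smaller mean loss corresponds to clean labels.
    clean_component = int(np.argmin(gmm.means_.ravel()))
    p_clean = gmm.predict_proba(losses)[:, clean_component]
    return p_clean > threshold

# Synthetic example: clean samples cluster at low loss, noisy ones high.
rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(0.1, 0.03, 80), rng.normal(0.6, 0.1, 20)])
mask = split_clean_noisy(losses)
print(f"{mask.sum()} of {len(losses)} samples judged clean")
```

In practice such a mask would be recomputed as training progresses, since the loss distribution shifts each epoch; the paper’s positive feedback method governs that iteration.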
Graphical Abstract
ISSN: 0140-0118, 1741-0444
DOI: 10.1007/s11517-024-03170-4