Segmentation of Lymph Nodes in Ultrasound Images Using U-Net Convolutional Neural Networks and Gabor-Based Anisotropic Diffusion
Published in: Journal of medical and biological engineering 2021-12, Vol.41 (6), p.942-952
Format: Article
Language: English
Online access: Full text
Abstract:
Purpose
The automated segmentation of lymph nodes (LNs) in ultrasound images is challenging, largely because of speckle noise and echogenic hila. This paper proposes a fully automatic and accurate method for LN segmentation in ultrasound that overcomes these issues.
Methods
The proposed segmentation method integrates diffusion-based despeckling, a U-Net convolutional neural network, and morphological operations. First, speckle noise is suppressed and lymph node edges are enhanced using Gabor-based anisotropic diffusion (GAD). Then, a modified U-Net model segments the LNs, excluding any echogenic hila. Finally, morphological operations recover the entire LN regions by filling in any areas occupied by echogenic hila.
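The paper itself provides no code; the sketch below is only a minimal illustration of the three-stage pipeline under stated assumptions. A Perona-Malik-style diffusion stands in for the Gabor-based anisotropic diffusion (the Gabor edge estimate is not reproduced here), `unet_model` is assumed to be any trained U-Net exposing a Keras-style `predict` that returns per-pixel probabilities, and all parameter values are placeholders rather than the authors' settings.

```python
# Minimal sketch of the despeckle -> U-Net -> morphology pipeline
# (illustrative only; not the authors' implementation).
import numpy as np
from scipy import ndimage

def diffusion_despeckle(img, n_iter=20, kappa=30.0, gamma=0.15):
    """Perona-Malik-style anisotropic diffusion as a simplified stand-in
    for Gabor-based anisotropic diffusion (GAD): smooths speckle while
    preserving lymph node edges."""
    img = img.astype(np.float32)
    for _ in range(n_iter):
        # Intensity differences to the four nearest neighbours.
        dn = np.roll(img, -1, axis=0) - img
        ds = np.roll(img, 1, axis=0) - img
        de = np.roll(img, -1, axis=1) - img
        dw = np.roll(img, 1, axis=1) - img
        # Edge-stopping conductance: close to zero across strong edges.
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        img = img + gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return img

def segment_lymph_node(image, unet_model, threshold=0.5):
    """Despeckle, predict an LN mask with a U-Net (hilum excluded), then
    fill hilum regions with morphological operations."""
    despeckled = diffusion_despeckle(image)
    # Assumed model interface: (1, H, W, 1) input, per-pixel probabilities out.
    prob = unet_model.predict(despeckled[None, ..., None])[0, ..., 0]
    mask = prob > threshold                       # LN tissue without the hilum
    mask = ndimage.binary_closing(mask, iterations=3)
    return ndimage.binary_fill_holes(mask)        # whole LN including the hilum
```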
Results
A total of 531 lymph nodes from 526 patients were segmented using the proposed method. Its segmentation performance was evaluated in terms of its accuracy, sensitivity, specificity, Jaccard similarity and Dice coefficient, for which it achieved values of 0.934, 0.939, 0.937, 0.763 and 0.865, respectively.
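For reference, the reported scores follow standard pixel-wise definitions; the short sketch below shows how such values are typically computed from a predicted and a ground-truth binary mask (whether the authors use exactly these pixel-wise formulas is an assumption, and the function name is illustrative).

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Pixel-wise accuracy, sensitivity, specificity, Jaccard similarity,
    and Dice coefficient for two binary masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.sum(pred & gt)      # true positives
    tn = np.sum(~pred & ~gt)    # true negatives
    fp = np.sum(pred & ~gt)     # false positives
    fn = np.sum(~pred & gt)     # false negatives
    return {
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "jaccard":     tp / (tp + fp + fn),
        "dice":        2 * tp / (2 * tp + fp + fn),
    }
```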
Conclusion
The proposed method automatically and accurately segments LNs in ultrasound images, improving the prospects for artificial intelligence (AI)-based diagnosis of lymph node diseases.
ISSN: 1609-0985, 2199-4757
DOI: 10.1007/s40846-021-00670-8