Segmentation of metastatic cervical lymph nodes from CT images of oral cancers using deep-learning technology
Published in: Dento-maxillo-facial radiology, 2022-05, Vol. 51 (4), p. 20210515
Format: Article
Language: English
Abstract: The purpose of this study was to establish a deep-learning model for segmenting the cervical lymph nodes of oral cancer patients and for distinguishing metastatic from non-metastatic lymph nodes on contrast-enhanced computed tomography (CT) images.
CT images of 158 metastatic and 514 non-metastatic lymph nodes were prepared and assigned to training, validation, and test datasets. For the training and validation datasets, images with the lymph nodes color-labeled were prepared alongside the original images. Training was performed for 200 epochs using the U-Net neural network, and performance in segmenting lymph nodes and in diagnosing metastasis was evaluated, as illustrated in the sketch below.
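As a rough illustration of this kind of setup, the sketch below trains a small U-Net-style encoder-decoder on placeholder image/mask pairs in PyTorch. The abstract specifies only the architecture family (U-Net) and the 200-epoch schedule, so the layer widths, loss function, optimizer, learning rate, and input size here are illustrative assumptions, not the authors' configuration.

```python
# Minimal U-Net-style segmentation sketch (PyTorch). Layer widths, loss,
# optimizer, and image size are assumptions; only "U-Net" and "200 epochs"
# come from the abstract.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    """Two-level encoder-decoder with skip connections (U-Net pattern)."""
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, out_ch, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # raw logits; apply sigmoid for a probability map

# Illustrative training loop over dummy tensors standing in for CT slices
# and their binary lymph-node masks.
model = TinyUNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()

images = torch.randn(4, 1, 128, 128)                 # placeholder CT slices
masks = (torch.rand(4, 1, 128, 128) > 0.5).float()   # placeholder masks

for epoch in range(200):  # the paper reports 200 training epochs
    optimizer.zero_grad()
    loss = criterion(model(images), masks)
    loss.backward()
    optimizer.step()
```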
Segmentation of metastatic lymph nodes achieved a recall of 0.742, a precision of 0.942, and an F1 score of 0.831. Recall was highest for metastatic lymph nodes at level II, at 0.875. For diagnosing metastasis, the model achieved an area under the curve (AUC) of 0.950, significantly higher than that of the radiologists (0.896).
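As a quick consistency check (not part of the paper), the F1 score is the harmonic mean of precision and recall, so the reported value can be approximately reproduced from the published recall and precision:

```python
# Cross-check of the reported segmentation metrics: F1 is the harmonic mean
# of precision and recall, so the published values should reproduce the
# reported F1 score up to rounding.
recall, precision = 0.742, 0.942
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))  # 0.83, consistent with the reported 0.831 given rounding
```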
A deep-learning model was created to automatically segment the cervical lymph nodes of oral squamous cell carcinomas. Segmentation performance still needs improvement, but the segmented lymph nodes were diagnosed for metastasis more accurately than by human evaluation.
ISSN: 0250-832X, 1476-542X
DOI: 10.1259/dmfr.20210515