Children’s dental panoramic radiographs dataset for caries segmentation and dental disease detection

Bibliographic details
Published in: Scientific Data, 2023-06, Vol. 10 (1), p. 380, Article 380
Authors: Zhang, Yifan, Ye, Fan, Chen, Lingxiao, Xu, Feng, Chen, Xiaodiao, Wu, Hongkun, Cao, Mingguo, Li, Yunxiang, Wang, Yaqi, Huang, Xingru
Format: Article
Language: English
Online access: Full text
Description
Abstract: Pediatric patients undergoing tooth replacement have more complex tooth development than adults, so dentists must manually diagnose their diseases with the help of preoperative dental panoramic radiographs. To the best of our knowledge, there is no international public dataset for children's teeth and only a few datasets for adults' teeth, which limits the development of deep learning algorithms for tooth segmentation and automatic disease analysis. We therefore collected dental panoramic radiographs and case records from 106 pediatric patients aged 2 to 13 years and, using the efficient interactive segmentation annotation software EISeg (Efficient Interactive Segmentation) and the image annotation software LabelMe, created segmentation and detection annotations to propose the world's first dataset of children's dental panoramic radiographs for caries segmentation and dental disease detection. In addition, another 93 dental panoramic radiographs of pediatric patients, together with our three internationally published adult dental datasets totaling 2,692 images, were collected and made into a segmentation dataset suitable for deep learning.
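Since the annotations were produced with LabelMe, a minimal sketch of how such polygon annotations are commonly rasterized into binary masks for training segmentation models may be useful. This is not code from the paper; the file name and the "caries" label string are placeholders, and the JSON field names assume the standard LabelMe export format.

```python
# Sketch only: convert a LabelMe-style JSON annotation into a binary mask.
# Assumes the standard LabelMe fields (imageHeight, imageWidth, shapes).
import json
import numpy as np
from PIL import Image, ImageDraw

def labelme_json_to_mask(json_path, target_label="caries"):
    """Rasterize all polygons carrying `target_label` into one binary mask."""
    with open(json_path) as f:
        ann = json.load(f)
    h, w = ann["imageHeight"], ann["imageWidth"]
    mask = Image.new("L", (w, h), 0)
    draw = ImageDraw.Draw(mask)
    for shape in ann.get("shapes", []):
        if shape["label"] == target_label and shape["shape_type"] == "polygon":
            # LabelMe stores polygon points as [[x1, y1], [x2, y2], ...]
            draw.polygon([tuple(p) for p in shape["points"]], outline=1, fill=1)
    return np.array(mask, dtype=np.uint8)

# Hypothetical usage: mask = labelme_json_to_mask("radiograph_001.json")
```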
ISSN: 2052-4463
DOI: 10.1038/s41597-023-02237-5