Use of artificial intelligence to recover mandibular morphology after disease


Bibliographic Details
Published in: Scientific Reports, 2020-10, Vol. 10 (1), p. 16431, Article 16431
Main authors: Liang, Ye; Huan, JingJing; Li, Jia-Da; Jiang, CanHua; Fang, ChangYun; Liu, YongGang
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Mandibular tumors and radical oral cancer surgery often cause bone dysmorphia and defects. Most patients present with noticeable mandibular deformations, and doctors often have difficulty determining their exact mandibular morphology. In this study, a deep convolutional generative adversarial network (DCGAN) called CTGAN is proposed to complete 3D mandibular cone beam computed tomography data from CT data. After extensive training, CTGAN was tested on 6 mandibular tumor cases, resulting in 3D virtual mandibular completion. We found that CTGAN can generate mandibles with different levels and rich morphology, including positional and angular changes and local patterns. The completion results are shown as tomographic images combining generated and natural areas. The 3D generated mandibles have the anatomical morphology of real mandibles and transition smoothly to the disease-free portions, showing that CTGAN constructs mandibles with the expected patient characteristics and is suitable for mandibular morphological completion. The presented modeling principles can be applied to other areas for 3D morphological completion from medical images. Clinical trial registration: This study is not a clinical trial; patient data were used only for testing in a virtual environment. The use of the digital data in this study was ethically approved.
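
The abstract gives no implementation details of CTGAN itself. As an illustration only, the sketch below shows a generic DCGAN-style 3D generator/discriminator pair in PyTorch together with a mask-based compositing step that mixes generated and natural voxels, in the spirit of the completion results described above. All layer sizes, names (Generator3D, Discriminator3D, composite, defect_mask), and the compositing scheme are assumptions for demonstration, not the authors' architecture.

```python
# Illustrative sketch of a DCGAN-style 3D completion setup (not the authors' CTGAN).
import torch
import torch.nn as nn

class Generator3D(nn.Module):
    """Maps a latent vector to a 64x64x64 voxel volume."""
    def __init__(self, z_dim=128, base=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose3d(z_dim, base * 8, 4, 1, 0, bias=False),  # 1^3 -> 4^3
            nn.BatchNorm3d(base * 8), nn.ReLU(True),
            nn.ConvTranspose3d(base * 8, base * 4, 4, 2, 1, bias=False),  # 4^3 -> 8^3
            nn.BatchNorm3d(base * 4), nn.ReLU(True),
            nn.ConvTranspose3d(base * 4, base * 2, 4, 2, 1, bias=False),  # 8^3 -> 16^3
            nn.BatchNorm3d(base * 2), nn.ReLU(True),
            nn.ConvTranspose3d(base * 2, base, 4, 2, 1, bias=False),  # 16^3 -> 32^3
            nn.BatchNorm3d(base), nn.ReLU(True),
            nn.ConvTranspose3d(base, 1, 4, 2, 1, bias=False),  # 32^3 -> 64^3, 1 channel
            nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z.view(z.size(0), -1, 1, 1, 1))

class Discriminator3D(nn.Module):
    """Scores a 64^3 volume as real (intact mandible scan) or generated."""
    def __init__(self, base=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, base, 4, 2, 1, bias=False),
            nn.LeakyReLU(0.2, True),
            nn.Conv3d(base, base * 2, 4, 2, 1, bias=False),
            nn.BatchNorm3d(base * 2), nn.LeakyReLU(0.2, True),
            nn.Conv3d(base * 2, base * 4, 4, 2, 1, bias=False),
            nn.BatchNorm3d(base * 4), nn.LeakyReLU(0.2, True),
            nn.Conv3d(base * 4, base * 8, 4, 2, 1, bias=False),
            nn.BatchNorm3d(base * 8), nn.LeakyReLU(0.2, True),
            nn.Conv3d(base * 8, 1, 4, 1, 0, bias=False),  # 4^3 -> single logit
        )

    def forward(self, x):
        return self.net(x).view(-1)

def composite(scan, generated, defect_mask):
    """Assumed completion step: keep healthy voxels, fill only the defect region,
    so tomographic slices combine natural and generated areas."""
    return scan * (1 - defect_mask) + generated * defect_mask

if __name__ == "__main__":
    g, d = Generator3D(), Discriminator3D()
    z = torch.randn(2, 128)
    fake = g(z)                      # (2, 1, 64, 64, 64)
    print(fake.shape, d(fake).shape)
```

One design note on the sketch: pasting generated voxels only inside a defect mask is what allows the output to transition smoothly into the patient's disease-free anatomy, matching the abstract's description of slices that combine generated and natural areas.
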
ISSN: 2045-2322
DOI: 10.1038/s41598-020-73394-5