Evaluation of Radiograph Accuracy in Skull X-ray Images Using Deep Learning
Published in: Japanese Journal of Radiological Technology, 2022/01/20, Vol. 78(1), pp. 23-32
Main Authors: , , , , , , ,
Format: Article
Language: English; Japanese
Online Access: Full text
Abstract: Purpose: Accurate positioning is essential in radiography, and it is especially important for maintaining image reproducibility in follow-up examinations. The decision to retake a radiograph is entrusted to the individual radiological technologist; the evaluation is visual and qualitative, and the acceptance criteria vary between individuals. In this study, we propose an image-evaluation method for skull X-ray images using a deep convolutional neural network (DCNN). Method: Radiographs were obtained from five skull phantoms and classified by a simple network and by VGG16. The discrimination ability of the DCNN was verified by recognizing the X-ray projection angle and the need to retake the radiograph. The DCNN architectures were tested with different input image sizes and evaluated by 5-fold cross-validation and leave-one-out cross-validation. Results: With 5-fold cross-validation, the classification accuracy for small input images was 99.75% for the simple network and 80.00% for VGG16; with a general (larger) input image size, the simple network and VGG16 achieved 79.58% and 80.00%, respectively. Conclusion: The experimental results showed that the combination of a small input image size and a shallow DCNN architecture was suitable for the four-category classification of X-ray projection angles, with a classification accuracy of up to 99.75%. The proposed method has the potential to automatically recognize slight projection angles and images requiring retakes according to the acceptance criteria. We consider that the proposed method can contribute to feedback on retaking images and to reducing the radiation dose caused by individual subjectivity.
ISSN: 0369-4305, 1881-4883
DOI: 10.6009/jjrt.780104
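
The abstract describes a shallow DCNN ("simple network") trained on small input images and evaluated with 5-fold cross-validation for four-category classification of projection angles. The sketch below is only an illustration of that general setup, not the authors' architecture or data: the layer configuration, the 64x64 input size, the PyTorch implementation, and the synthetic placeholder images are all assumptions made for demonstration.

```python
# Minimal sketch (assumed details): shallow CNN for 4-class projection-angle
# classification on small grayscale inputs, scored by 5-fold cross-validation.
import numpy as np
import torch
import torch.nn as nn
from sklearn.model_selection import StratifiedKFold


class SimpleNet(nn.Module):
    """Shallow CNN: two conv/pool blocks and a small classifier head."""

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),  # 64x64 input -> 16x16 after two pools
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))


def run_cv(images: np.ndarray, labels: np.ndarray, epochs: int = 10) -> float:
    """5-fold stratified cross-validation; returns the mean validation accuracy."""
    skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    accs = []
    for train_idx, val_idx in skf.split(images, labels):
        model = SimpleNet()
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        x_tr = torch.tensor(images[train_idx], dtype=torch.float32)
        y_tr = torch.tensor(labels[train_idx], dtype=torch.long)
        x_va = torch.tensor(images[val_idx], dtype=torch.float32)
        y_va = torch.tensor(labels[val_idx], dtype=torch.long)
        for _ in range(epochs):          # full-batch training for brevity
            model.train()
            opt.zero_grad()
            loss = loss_fn(model(x_tr), y_tr)
            loss.backward()
            opt.step()
        model.eval()
        with torch.no_grad():
            pred = model(x_va).argmax(dim=1)
        accs.append((pred == y_va).float().mean().item())
    return float(np.mean(accs))


if __name__ == "__main__":
    # Placeholder data (assumption): random 64x64 "radiographs" with random
    # labels for four projection-angle classes, standing in for the phantom images.
    rng = np.random.default_rng(0)
    x = rng.random((200, 1, 64, 64)).astype(np.float32)
    y = rng.integers(0, 4, size=200)
    print(f"mean CV accuracy: {run_cv(x, y):.3f}")
```

Leave-one-out cross-validation, also used in the paper, would follow the same pattern with sklearn's LeaveOneOut in place of StratifiedKFold; swapping SimpleNet for a pretrained VGG16 would correspond to the larger architecture compared in the study.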