A Deep Learning System for Automated Quality Evaluation of Optic Disc Photographs in Neuro-Ophthalmic Disorders
Published in: Diagnostics (Basel), 2023-01, Vol. 13 (1), p. 160
Main Authors: , , , , , , ,
Format: Article
Language: English
Online Access: Full text
Abstract: The quality of ocular fundus photographs can affect the accuracy of the morphologic assessment of the optic nerve head (ONH), either by humans or by deep learning systems (DLS). In order to automatically identify ONH photographs of optimal quality, we developed, trained, and tested a DLS using an international, multicentre, multi-ethnic dataset of 5015 ocular fundus photographs from 31 centres in 20 countries participating in the Brain and Optic Nerve Study with Artificial Intelligence (BONSAI). The reference standard for image quality was established by three experts who independently classified photographs as being of "good", "borderline", or "poor" quality. The DLS was trained on 4208 fundus photographs and tested on an independent external dataset of 807 photographs, using a multi-class model evaluated with a one-vs-rest classification strategy. In the external testing dataset, the DLS identified "good" quality photographs with excellent performance (AUC = 0.93 (95% CI, 0.91-0.95), accuracy = 91.4% (95% CI, 90.0-92.9%), sensitivity = 93.8% (95% CI, 92.5-95.2%), specificity = 75.9% (95% CI, 69.7-82.1%)) and likewise "poor" quality photographs (AUC = 1.00 (95% CI, 0.99-1.00), accuracy = 99.1% (95% CI, 98.6-99.6%), sensitivity = 81.5% (95% CI, 70.6-93.8%), specificity = 99.7% (95% CI, 99.6-100.0%)). "Borderline" quality images were also accurately classified (AUC = 0.90 (95% CI, 0.88-0.93), accuracy = 90.6% (95% CI, 89.1-92.2%), sensitivity = 65.4% (95% CI, 56.6-72.9%), specificity = 93.4% (95% CI, 92.1-94.8%)). The overall accuracy in distinguishing among the three classes was 90.6% (95% CI, 89.1-92.1%), suggesting that this DLS could select optimal-quality fundus photographs in patients with neuro-ophthalmic and neurological disorders affecting the ONH.
ISSN: 2075-4418
DOI: 10.3390/diagnostics13010160
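
The abstract reports per-class AUC, accuracy, sensitivity, and specificity from a one-vs-rest evaluation of a three-class ("good"/"borderline"/"poor") quality model. The sketch below illustrates how such one-vs-rest metrics can be computed from predicted class probabilities on a held-out test set; it is a minimal, hypothetical example using scikit-learn and synthetic data, not the BONSAI pipeline itself, and the function name and label encoding are assumptions of this sketch.

```python
# Illustrative sketch only: one-vs-rest evaluation of a 3-class quality grader.
# Assumes integer labels 0="good", 1="borderline", 2="poor" and per-class
# softmax probabilities produced by some previously trained classifier.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix, accuracy_score

CLASSES = ["good", "borderline", "poor"]

def one_vs_rest_report(y_true, y_prob):
    """Per-class AUC, accuracy, sensitivity, specificity, plus overall accuracy.

    y_true : (n,) integer class labels
    y_prob : (n, 3) predicted class probabilities
    """
    y_pred = np.argmax(y_prob, axis=1)
    report = {}
    for k, name in enumerate(CLASSES):
        # Binarise the problem: current class vs. the rest
        t = (y_true == k).astype(int)
        p = (y_pred == k).astype(int)
        tn, fp, fn, tp = confusion_matrix(t, p, labels=[0, 1]).ravel()
        report[name] = {
            "AUC": roc_auc_score(t, y_prob[:, k]),
            "accuracy": (tp + tn) / (tp + tn + fp + fn),
            "sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
            "specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
        }
    report["overall_accuracy"] = accuracy_score(y_true, y_pred)
    return report

if __name__ == "__main__":
    # Synthetic stand-in data, sized like the external test set, to keep the sketch runnable
    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 3, size=807)
    logits = rng.normal(size=(807, 3)) + 2.0 * np.eye(3)[y_true]
    y_prob = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    for name, metrics in one_vs_rest_report(y_true, y_prob).items():
        print(name, metrics)
```

In this binarised setting, sensitivity is the recall of the target class and specificity is the recall of the pooled remaining classes, which is why a class can show high overall accuracy yet modest sensitivity (as seen for the "borderline" grade in the abstract).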