Classification of Multiple Visual Field Defects using Deep Learning
Published in: Journal of Physics: Conference Series, 2021-02, Vol. 1755 (1), p. 012014
Main authors: , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: In this work, a custom deep learning method is proposed to detect visual field defects, which are markers of serious optic pathway disease. The Convolutional Neural Network (CNN) is a deep learning method widely used in image processing; accordingly, a custom 10-layer CNN is built to detect visual field defects. 1200 visual field defect images acquired from the Humphrey Field Analyzer 24-2 and collected from Google Images are used to classify 6 types of visual field defect. The defect patterns include central scotoma, right/left/upper/lower quadrantanopia, right/left hemianopia, tunnel vision, superior/inferior field defect, and normal as baseline. The custom-designed CNN is trained to discriminate between the defect patterns in visual field images. The proposed method includes a pre-processing stage to improve classification, after which the 6 visual field defect patterns are detected by the CNN. The dataset is evaluated using 5-fold cross-validation. The results show that the proposed algorithm achieves a high classification rate of 96%. For comparison, a traditional machine learning method, the Support Vector Machine (SVM), and a classical Neural Network (NN) were chosen, obtaining classification rates of 74.54% and 90.72%, respectively.
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/1755/1/012014
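The abstract above outlines the pipeline (pre-processing, a 10-layer CNN over 6 defect classes, 5-fold cross-validation) but not the exact architecture. The following is a minimal sketch in Python/Keras under stated assumptions, not the authors' implementation: the input resolution, layer widths, optimizer, and epoch count are illustrative guesses, and the helpers `build_cnn` and `cross_validate` are hypothetical names.

```python
# Illustrative sketch only: the record describes a custom 10-layer CNN,
# 6 defect classes, and 5-fold cross-validation, but not the exact
# architecture. Input size, layer widths, and hyperparameters below are
# assumptions, not the authors' configuration.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.model_selection import StratifiedKFold

NUM_CLASSES = 6              # central scotoma, quadrantanopia, hemianopia,
                             # tunnel vision, superior/inferior defect, normal
INPUT_SHAPE = (128, 128, 1)  # assumed grayscale field-plot resolution

def build_cnn():
    """Builds an illustrative 10-layer CNN classifier (hypothetical helper)."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(32, 3, padding="same", activation="relu"),   # 1
        layers.MaxPooling2D(),                                     # 2
        layers.Conv2D(64, 3, padding="same", activation="relu"),   # 3
        layers.MaxPooling2D(),                                     # 4
        layers.Conv2D(128, 3, padding="same", activation="relu"),  # 5
        layers.MaxPooling2D(),                                     # 6
        layers.Flatten(),                                          # 7
        layers.Dense(128, activation="relu"),                      # 8
        layers.Dropout(0.5),                                       # 9
        layers.Dense(NUM_CLASSES, activation="softmax"),           # 10
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def cross_validate(X, y, epochs=20):
    """5-fold stratified cross-validation over images X (N, 128, 128, 1)
    and integer labels y (N,), mirroring the evaluation in the abstract."""
    scores = []
    for train_idx, test_idx in StratifiedKFold(
            n_splits=5, shuffle=True, random_state=0).split(X, y):
        model = build_cnn()
        model.fit(X[train_idx], y[train_idx], epochs=epochs, verbose=0)
        _, acc = model.evaluate(X[test_idx], y[test_idx], verbose=0)
        scores.append(acc)
    return float(np.mean(scores))
```

The SVM and NN baselines cited in the abstract could be sketched analogously with scikit-learn's `SVC` and `MLPClassifier` over flattened pixel vectors; those, too, would be assumptions rather than the authors' setup.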