HPWO-LS-based deep learning approach with S-ROA-optimized optic cup segmentation for fundus image classification
Published in: Neural Computing & Applications, 2021-08, Vol. 33 (15), p. 9677-9690
Main authors: , ,
Format: Article
Language: English
Keywords:
Online access: Full text
Abstract: Automated retinal image processing has recently become a competitive field of research, as existing methods suffer from low accuracy, high complexity, and otherwise unacceptable outcomes. In this article, we propose a novel approach for classifying fundus images across several fundus disorders. The original images are first preprocessed for noise removal and contrast enhancement using contrast limited adaptive histogram equalization (CLAHE). Optic cup segmentation from the fundus images is then handled by the search and rescue optimization algorithm (S-ROA). Next, color-, texture-, and shape-based gray-level co-occurrence matrix (GLCM) features are extracted. A hybrid particle swarm optimization with a local search strategy (HPWO-LS) tunes the DNN parameters, and the resulting method is termed the optimal DNN. The optimal DNN classifies each image as diabetic retinopathy, glaucoma, or age-related macular degeneration. The approach is validated experimentally on the STARE, Drishti, and RIM-One datasets with standard performance measures, and it achieves higher classification performance in terms of accuracy, specificity, sensitivity, precision, recall, and F-measure.
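To make the preprocessing step concrete, here is a minimal sketch of CLAHE contrast enhancement with OpenCV. The clip limit, tile grid size, and the choice of the green channel are assumptions for illustration, not settings reported in the paper.

```python
# Illustrative CLAHE preprocessing; clipLimit, tileGridSize, and the use of
# the green channel are assumed values, not the paper's configuration.
import cv2

def enhance_fundus(path):
    """Apply CLAHE to the green channel of a fundus image. Using the green
    channel is an assumption; it commonly carries the most retinal contrast,
    but the paper does not specify a channel."""
    bgr = cv2.imread(path)                      # OpenCV loads images as BGR
    green = bgr[:, :, 1]                        # green channel, 8-bit
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(green)                   # contrast-enhanced image
```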
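The optic cup segmentation relies on the search and rescue optimization algorithm (S-ROA), whose update operators are not reproduced here. The toy sketch below conveys only the general idea: a metaheuristic (plain random search as a stand-in) selects a threshold that maximizes Otsu's between-class variance to produce a binary cup mask.

```python
# Stand-in illustration only: random search replaces S-ROA, and simple
# thresholding replaces the paper's actual segmentation formulation.
import numpy as np

def between_class_variance(gray, t):
    """Otsu's between-class variance for threshold t on an 8-bit image."""
    fg, bg = gray[gray > t], gray[gray <= t]
    if fg.size == 0 or bg.size == 0:
        return 0.0
    w1, w2 = fg.size / gray.size, bg.size / gray.size
    return w1 * w2 * (fg.mean() - bg.mean()) ** 2

def segment_cup(gray, n_trials=200, seed=0):
    """Pick the best of n_trials random thresholds, return a binary mask."""
    rng = np.random.default_rng(seed)
    candidates = rng.integers(1, 255, n_trials)
    best_t = max(candidates, key=lambda t: between_class_variance(gray, t))
    return (gray > best_t).astype(np.uint8)
```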
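Gray-level co-occurrence matrix features can be computed with scikit-image as sketched below; the distances, angles, and property set are assumptions rather than the paper's configuration (in scikit-image versions before 0.19 the functions are named greycomatrix and greycoprops).

```python
# Minimal GLCM texture-feature sketch; parameter choices are assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_u8):
    """Return a small GLCM texture descriptor for an 8-bit grayscale image."""
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    # Each property yields one value per (distance, angle) pair.
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])
```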
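Finally, the classifier's parameters are tuned with hybrid particle swarm optimization plus a local search strategy (HPWO-LS). The generic sketch below, run on a stand-in objective, shows only the hybrid pattern, standard PSO velocity updates followed by a local perturbation of each personal best; the paper's actual operators and the DNN objective are not reproduced.

```python
# Generic PSO-with-local-search sketch; coefficients, bounds, and the
# sphere objective are placeholders, not the paper's HPWO-LS settings.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                      # stand-in objective; the paper would
    return float(np.sum(x ** 2))    # minimize DNN validation loss instead

def pso_local_search(f, dim=4, n=20, iters=50, lo=-5.0, hi=5.0):
    x = rng.uniform(lo, hi, (n, dim))            # particle positions
    v = np.zeros_like(x)                         # velocities
    pbest, pcost = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pcost)].copy()           # global best
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        cost = np.apply_along_axis(f, 1, x)
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        # Local-search step: perturb each personal best, keep improvements.
        trial = np.clip(pbest + rng.normal(0, 0.1, pbest.shape), lo, hi)
        tcost = np.apply_along_axis(f, 1, trial)
        better = tcost < pcost
        pbest[better], pcost[better] = trial[better], tcost[better]
        g = pbest[np.argmin(pcost)].copy()
    return g, float(pcost.min())

best, cost = pso_local_search(sphere)
```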
ISSN: 0941-0643 (print); 1433-3058 (electronic)
DOI: 10.1007/s00521-021-05732-1