MSAmix-Net: Diabetic Retinopathy Classification


Full Description

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 185757-185767
Main Authors: Gao, Jianyun; Li, Shu; Chen, Yiwen; Xiang, Rongwu
Format: Article
Language: English
Online Access: Full Text
Description
Summary: Diabetic retinopathy (DR) is a very common complication of diabetes that can lead to retinal damage, affecting vision. If not detected and diagnosed in time, it may result in visual impairment. With the development of deep learning, various automatic diagnosis models for DR have been proposed. Most models are based on convolutional neural networks, but because the convolution kernels in shallow networks are small, the receptive field is limited, preventing the capture of global information. This limitation forces models to stack more convolutional layers to expand the receptive field and capture global information related to DR. In this paper, we use a combined module of multiscale convolution and convolutional self-attention, and we adjust the module structure so that the model has both the long-range dependency modeling capability of self-attention and the multiscale local feature extraction capability of CNNs, while having fewer parameters than the selected comparison models. Our proposed MSAmix-Net achieved an accuracy of 82.3% in the five-class classification task on the combined APTOS-2019 and Messidor-2 dataset, and an accuracy of 93.9% in the three-class classification task on the DRAC-2022 dataset. The experiments demonstrate that the model can effectively capture and represent the complex features of diabetic retinopathy.
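The abstract's core argument is that stacked small kernels grow a CNN's receptive field only linearly, which is why the paper mixes in self-attention (global reach in one step) alongside parallel multiscale kernels. A minimal sketch of that receptive-field arithmetic, assuming a simplified 1-D setting (the function here is illustrative and is not the paper's actual MSAmix-Net code):

```python
# Toy illustration (not the paper's code): receptive-field growth of
# stacked small convolutions, using the standard recurrence
# rf += (k - 1) * (product of all earlier strides).

def receptive_field(kernel_sizes, strides=None):
    """Receptive field of a stack of 1-D convolutions."""
    strides = strides or [1] * len(kernel_sizes)
    rf, jump = 1, 1
    for k, s in zip(kernel_sizes, strides):
        rf += (k - 1) * jump  # each layer widens rf by (k-1) * jump
        jump *= s             # stride compounds the per-layer step
    return rf

# Ten stacked 3-wide convolutions still cover only 21 input positions:
print(receptive_field([3] * 10))                      # -> 21

# A single multiscale layer with parallel 3/5/7 kernels covers 7 positions
# at once; self-attention, by contrast, attends to every position in one step.
print(max(receptive_field([k]) for k in (3, 5, 7)))   # -> 7
```

This is the motivation the abstract gives for combining multiscale convolution (cheap local detail at several scales) with convolutional self-attention (global context without stacking many layers).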
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3506714