Soft Attention Based EfficientNetV2B3 Model for Skin Cancer's Disease Classification Using Dermoscopy Images

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 161283-161295
Main Authors: Ibrahim, Sally, Amin, Khalid M., Ibrahim Alkanhel, Reem, Abdallah, Hanaa A., Ibrahim, Mina
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Skin cancer ranks as one of the most widespread and lethal types of cancer. Without early identification and intervention, the disease tends to spread to other parts of the body. It arises primarily from the abnormal proliferation of skin cells, often triggered by exposure to sunlight. Despite recent advances in deep convolutional neural networks, such models still struggle to concentrate on the semantically meaningful regions of a lesion. Our study introduces a methodology that tackles this issue by coupling deep learning with a soft attention mechanism for feature aggregation, followed by classification layers; the soft attention mechanism amplifies the significance of crucial features while mitigating the impact of irrelevant ones within the network. On the benchmark ISIC Archive dataset, the proposed approach improves classification accuracy by 3% over the pre-trained EfficientNetV2B3 model, 2.3% over InceptionV3, and 1.7% over InceptionResNetV2. Our best model, the soft-attention EfficientNetV2B3, achieved the highest accuracy of 95.6%.
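
The record carries no implementation details beyond the abstract, so the following is only a minimal Keras-style sketch of the described idea, assuming an EfficientNetV2B3 backbone whose feature maps are re-weighted by a learned soft-attention map before the classification layers. The block structure, the name soft_attention_block, num_heads, the input size, and the dropout rate are illustrative assumptions, not the authors' exact architecture.

import tensorflow as tf
from tensorflow.keras import layers, models

def soft_attention_block(feature_map, num_heads=16):
    # Learn `num_heads` spatial score maps from the backbone features.
    h, w = feature_map.shape[1], feature_map.shape[2]
    scores = layers.Conv2D(num_heads, kernel_size=3, padding="same")(feature_map)
    # A softmax over all h*w positions turns each head into a spatial
    # probability distribution ("soft" attention: no location is zeroed out).
    attn = layers.Reshape((h * w, num_heads))(scores)
    attn = layers.Softmax(axis=1)(attn)
    attn = layers.Reshape((h, w, num_heads))(attn)
    # Collapse the heads into one map and re-weight the features with it.
    attn = layers.Lambda(
        lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(attn)
    attended = layers.Multiply()([feature_map, attn])
    # Concatenating keeps the original features alongside the attended ones.
    return layers.Concatenate()([feature_map, attended])

def build_model(num_classes, input_shape=(224, 224, 3)):
    backbone = tf.keras.applications.EfficientNetV2B3(
        include_top=False, weights="imagenet", input_shape=input_shape)
    x = soft_attention_block(backbone.output)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)  # illustrative rate, not from the paper
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(backbone.input, outputs)

model = build_model(num_classes=2)
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

The spatial softmax is what makes the attention "soft": every location keeps a non-zero weight, so salient lesion regions are amplified without hard-masking the rest of the image, which matches the abstract's description of amplifying crucial features while mitigating irrelevant ones.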
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3486153