Utilizing Convolutional Neural Networks for Image Classification and Securing Mobility of People With Physical and Mental Disabilities in Cloud Systems


Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 163730-163745
Main Authors: Hababeh, Ismail; Mahameed, Ibrahim; Abdelhadi, Abdelhadi A.; Barghash, Ahmad
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Image recognition is widely used for detecting human obstructions and identifying people with disabilities. The accuracy of identifying images of handicapped people is powered by image classification techniques that are based on deep learning methodologies. Specifically, convolutional neural networks are employed to improve image classification of people with mental and physical disabilities. In this research, images of people with different disabilities are used to extract hidden features that symbolize each disability. Three different deep learning image classifiers are built to classify images of people in wheelchairs, blind people, and people with Down syndrome. A security technique is developed that is based on multiprotocol label switching headers to secure the image mobility over cloud nodes. The proposed approach is validated by measuring the impact of the deep learning image classifiers on image classification and securing image mobility on cloud system performance. The experimental results show the effectiveness of the proposed approach in improving image prediction of disabled people and enhancing the performance of securing image mobility in cloud systems.
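The convolutional feature extraction the abstract refers to can be illustrated with a minimal sketch. This is not the authors' model: the toy image, the hand-crafted vertical-edge kernel, and the single conv → ReLU → max-pool stage are illustrative assumptions, showing only how a convolutional layer turns raw pixels into a feature map that a classifier could consume.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation of a grayscale image with a kernel."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation: zero out negative responses."""
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't fit the window."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy 6x6 "image": bright on the left, dark on the right (a vertical edge).
image = np.zeros((6, 6))
image[:, :3] = 1.0

# Hand-crafted vertical-edge detector; in a trained CNN such kernels are learned.
kernel = np.array([[1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0],
                   [1.0, 0.0, -1.0]])

# One conv -> ReLU -> pool stage produces a 2x2 feature map that
# responds strongly wherever the edge is present.
feature_map = max_pool(relu(conv2d(image, kernel)))
print(feature_map)  # every pooled cell sees the edge response of 3.0
```

Stacking several such stages with learned kernels, then flattening into a small classifier head, is the general pattern behind the three disability-image classifiers the paper describes.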
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3020866