An efficient black widow optimization-based faster R-CNN for classification of COVID-19 from CT images

Bibliographic Details
Published in: Multimedia Systems 2024-04, Vol. 30 (2), Article 108
Main Authors: Vani, S., Malathi, P., Ramya, V. Jeya, Sriman, B., Saravanan, M., Srivel, R.
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Coronavirus disease (COVID-19) is a transmissible disease caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). This paper describes the identification of COVID-19 infection, in support of better treatment, using recent technology: the classification and prediction of COVID-19 from the most significant computed tomography (CT) image features. The CT image database is collected from the openly accessible Kaggle repository. Features are extracted from the CT images with the Gray-Level Co-occurrence Matrix (GLCM) technique, and the Improved Whale Optimization and Moth Flame Optimization (IWOMFO) algorithm is then applied for feature selection and image segmentation, maximizing the objective function. The segmented features are classified with the Black Widow Optimization-based Faster R-CNN (BWOFRCNN) method. Accuracy, F1-score, sensitivity, and precision are the parameters used to evaluate performance. Compared with other methods, the proposed BWOFRCNN classifier achieves a maximum accuracy of about 98.78%, a sensitivity of about 97.58%, and a precision of about 96.95%.
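As a concrete illustration of the GLCM feature-extraction step mentioned in the summary, the sketch below computes standard co-occurrence texture features from a single grayscale CT slice with scikit-image. This is only a minimal sketch under assumed tooling (NumPy, scikit-image, an 8-bit input slice, distance 1 and four orientations); it is not the authors' implementation or their parameter choices.

```python
# Minimal GLCM feature-extraction sketch (illustrative, not the paper's code).
# Assumes a grayscale CT slice loaded as a 2-D uint8 NumPy array.
import numpy as np
from skimage.feature import graycomatrix, graycoprops


def glcm_features(ct_slice: np.ndarray) -> dict:
    """Compute common GLCM texture features from a 2-D uint8 CT slice."""
    # Co-occurrence matrix at pixel distance 1 for 0, 45, 90, and 135 degrees.
    glcm = graycomatrix(
        ct_slice,
        distances=[1],
        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
        levels=256,
        symmetric=True,
        normed=True,
    )
    # Average each texture property over the four orientations.
    props = ("contrast", "dissimilarity", "homogeneity", "energy", "correlation", "ASM")
    return {p: float(graycoprops(glcm, p).mean()) for p in props}


if __name__ == "__main__":
    # Synthetic stand-in for a CT slice; replace with a real image,
    # e.g. loaded via skimage.io.imread from the Kaggle dataset.
    demo = (np.random.default_rng(0).random((128, 128)) * 255).astype(np.uint8)
    print(glcm_features(demo))
```

The resulting per-slice feature vector could then feed a feature-selection and classification stage such as the IWOMFO and BWOFRCNN pipeline described in the paper.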
ISSN: 0942-4962
1432-1882
DOI: 10.1007/s00530-024-01281-4