Application of CNN for Detection and Localization of STEMI Using 12-Lead ECG Images

Bibliographic Details
Published in: IEEE Access, 2022, Vol. 10, pp. 38923-38930
Main Authors: Kavak, Serkan; Chiu, Xian-Dong; Yen, Shi-Jim; Chen, Michael Yu-Chih
Format: Article
Language: English
Online Access: Full text
Description
Abstract: STEMI (ST-elevation myocardial infarction) is the most severe type of myocardial infarction, causing death or disability. Previous studies among physicians and paramedics have shown that the accuracy of STEMI diagnosis from the 12-lead ECG is insufficient. We therefore propose a 2D-CNN model that detects and localizes STEMI signals in 12-lead ECG images. The model is trained as a binary classifier on 540 ECG images (270 STEMI cases and 270 other ECG images), and it achieves 96.3% accuracy, 96.2% sensitivity, 89.4% precision, a 0.926 F1-score, and a 0.962 ROC-AUC on 537 test images. The proposed model is compared with 10 different transfer-learning models and has the best accuracy, sensitivity, F1-score, and ROC-AUC. The Grad-CAM technique is used to localize STEMI signals in the ECG images, and according to the comparisons, the proposed model is the most reliable for this localization. Localization builds trust in the model: the CNN is no longer a black box, since we can see where it looks when it makes a decision. The localization results can also be used to teach inexperienced physicians and paramedics, and the proposed model can support accurate STEMI diagnosis with a quick response time in clinical practice.
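The Grad-CAM localization described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes the last convolutional layer's activations and the gradients of the STEMI class score with respect to them are already available as NumPy arrays, and the function name `grad_cam` is illustrative.

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap from a conv layer's outputs.

    feature_maps: (K, H, W) activations of the last conv layer.
    gradients:    (K, H, W) gradients of the class score w.r.t. those maps.
    Returns an (H, W) heatmap normalized to [0, 1].
    """
    # Global-average-pool the gradients: one importance weight per channel.
    weights = gradients.mean(axis=(1, 2))              # shape (K,)
    # Weighted sum of the feature maps across channels, then ReLU.
    cam = np.tensordot(weights, feature_maps, axes=1)  # shape (H, W)
    cam = np.maximum(cam, 0.0)
    # Normalize so the heatmap can be overlaid on the ECG image.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

In practice the (H, W) heatmap is upsampled to the input image size and overlaid on the 12-lead ECG, which is how the highlighted leads can be shown to the clinician.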
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2022.3165966