Identification of Pneumonia in Chest X-Ray Image Based on Transformer


Full description

Bibliographic details
Published in: International Journal of Antennas and Propagation 2022-08, Vol. 2022, p. 1-8
Main authors: Ma, Yongjun; Lv, Wei
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Research on application models built on traditional convolutional neural networks has gradually reached a performance plateau, and further improving chest X-ray image models has become a difficult problem. In this paper, the Swin Transformer is introduced into a pneumonia-recognition model for chest X-ray images and optimized for the characteristics of such images. Experimental results with this model are compared with those of models that use traditional convolutional neural networks as the backbone network, demonstrating a substantial improvement in accuracy. In comparison experiments on two different datasets, the model's accuracy improves from 76.3% to 87.3% and from 92.8% to 97.2%, respectively. The experiments also show that image enhancement tailored to the features of chest X-ray images yields higher accuracy than no enhancement. Finally, the decision factors behind each identification were extracted with Grad-CAM combined with the transformer to locate the corresponding approximate lesion regions.
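The distinguishing mechanism of the Swin Transformer named in the abstract is its (shifted) window partitioning: self-attention is computed inside small non-overlapping windows, and alternating blocks cyclically shift the feature map so information crosses window borders. A minimal NumPy sketch of just that partitioning step (function names and the toy 8x8 feature map are illustrative, not from the paper):

```python
import numpy as np

def window_partition(x, window_size):
    """Split an (H, W, C) feature map into non-overlapping square windows.

    Returns an array of shape (num_windows, window_size, window_size, C);
    attention would then be computed independently within each window.
    """
    H, W, C = x.shape
    x = x.reshape(H // window_size, window_size, W // window_size, window_size, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, window_size, window_size, C)

def cyclic_shift(x, shift):
    """Roll the feature map so the next block's windows straddle old borders."""
    return np.roll(x, shift=(-shift, -shift), axis=(0, 1))

# Toy 8x8 single-channel feature map; 4x4 windows give 4 windows.
feat = np.arange(64, dtype=np.float32).reshape(8, 8, 1)
windows = window_partition(feat, 4)                   # shape (4, 4, 4, 1)
shifted_windows = window_partition(cyclic_shift(feat, 2), 4)
```

After the shift by half a window, each new window mixes pixels from four of the previous windows, which is how successive Swin blocks exchange information globally while keeping attention cost linear in image size.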
ISSN: 1687-5869, 1687-5877
DOI: 10.1155/2022/5072666