New convolutional neural network model for screening and diagnosis of mammograms

Bibliographic Details
Published in: PLoS ONE 2020-08, Vol. 15 (8), p. e0237674
Authors: Zhang, Chen; Zhao, Jumin; Niu, Jing; Li, Dengao
Format: Article
Language: English
Online access: Full text
Description
Abstract: Breast cancer is the most common cancer in women and poses a great threat to women's life and health. Mammography is an effective method for the diagnosis of breast cancer, but its results are largely limited by the clinical experience of radiologists. Therefore, the main purpose of this study is to perform two-stage classification (Normal/Abnormal and Benign/Malignant) of two-view mammograms using a convolutional neural network. In this study, we constructed a multi-view feature fusion network model for the classification of mammograms from two views, and we proposed a multi-scale attention DenseNet as the backbone network for feature extraction. The model consists of two independent branches, which extract the features of the two mammographic views. Our work mainly focuses on the construction of the multi-scale convolution module and the attention module. The final experimental results show that the model achieves good performance in both classification tasks. We used the DDSM database to evaluate the proposed method. The accuracy, sensitivity, and AUC values for the classification of normal and abnormal mammograms were 94.92%, 96.52%, and 94.72%, respectively, while the accuracy, sensitivity, and AUC values for the classification of benign and malignant mammograms were 95.24%, 96.11%, and 95.03%, respectively.
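
To make the two-branch, multi-view fusion idea described in the abstract concrete, the following is a minimal PyTorch sketch. It is not the authors' implementation: it substitutes torchvision's stock DenseNet-121 for the paper's multi-scale attention DenseNet, fuses the two view features by simple concatenation, and uses hypothetical names (TwoViewFusionNet, view_cc, view_mlo) introduced only for illustration.

```python
# Minimal sketch of a two-branch, two-view mammogram classifier (illustrative only).
# Assumption: torchvision's DenseNet-121 stands in for the paper's
# multi-scale attention DenseNet backbone; fusion is plain concatenation.
import torch
import torch.nn as nn
from torchvision.models import densenet121


class TwoViewFusionNet(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # One independent feature-extraction branch per view (e.g. CC and MLO).
        self.branch_cc = densenet121(weights=None).features
        self.branch_mlo = densenet121(weights=None).features
        self.pool = nn.AdaptiveAvgPool2d(1)
        # DenseNet-121 yields 1024-channel feature maps; fuse features from both views.
        self.classifier = nn.Linear(1024 * 2, num_classes)

    def forward(self, view_cc: torch.Tensor, view_mlo: torch.Tensor) -> torch.Tensor:
        f_cc = self.pool(self.branch_cc(view_cc)).flatten(1)
        f_mlo = self.pool(self.branch_mlo(view_mlo)).flatten(1)
        fused = torch.cat([f_cc, f_mlo], dim=1)  # multi-view feature fusion
        return self.classifier(fused)


if __name__ == "__main__":
    model = TwoViewFusionNet(num_classes=2)   # Normal/Abnormal or Benign/Malignant
    cc = torch.randn(1, 3, 224, 224)          # dummy craniocaudal view
    mlo = torch.randn(1, 3, 224, 224)         # dummy mediolateral-oblique view
    logits = model(cc, mlo)
    print(logits.shape)                       # torch.Size([1, 2])
```

The same head could be trained separately for each of the two classification stages; the paper's multi-scale convolution and attention modules would replace or augment the stock DenseNet blocks.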
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0237674