A High-Precision Classification Method of Mammary Cancer Based on Improved DenseNet Driven by an Attention Mechanism

Bibliographic Details
Published in: Computational and Mathematical Methods in Medicine, 2022-05, Vol. 2022, p. 8585036-14
Authors: Xu, Xuebin; An, Meijuan; Zhang, Jiada; Liu, Wei; Lu, Longbin
Format: Article
Language: English
Online access: Full text
Description
Abstract: Cancer is one of the major causes of human disease and death worldwide, and mammary cancer is one of the most common cancer types among women today. In this paper, we used a deep learning method to conduct a preliminary experiment on the Breast Cancer Histopathological Database (BreakHis), an open dataset. We propose a high-precision classification method for mammary cancer on the BreakHis dataset based on an improved convolutional neural network. We propose three MFSCNet models, MFSCNet A, MFSCNet B, and MFSCNet C, which differ in the insertion positions and the number of SE modules. We carried out experiments on the BreakHis dataset, and the experimental comparison shows that the MFSCNet A model achieves the best performance in the high-precision classification of mammary cancer: binary classification accuracy ranges from 99.05% to 99.89%, and multiclass classification accuracy ranges from 94.36% to 98.41%. These results demonstrate that MFSCNet can accurately classify mammary histological images and has great application prospects for predicting the degree of a tumor. Code will be made available at http://github.com/xiaoan-maker/MFSCNet.
ISSN: 1748-670X, 1748-6718
DOI: 10.1155/2022/8585036
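
The abstract describes inserting squeeze-and-excitation (SE) attention modules at different positions in an improved DenseNet-style convolutional network. The sketch below is a minimal, hedged illustration of that general idea in PyTorch, not the authors' MFSCNet code (their release is linked above): the SEBlock class, the reduction ratio of 16, the DenseNet-121 backbone, and the 8-class head are illustrative assumptions.

```python
# Minimal sketch: an SE attention block appended to a DenseNet feature
# extractor. Hyperparameters and placement are assumptions for illustration,
# not the settings of MFSCNet A/B/C.
import torch
import torch.nn as nn
from torchvision import models


class SEBlock(nn.Module):
    """Channel attention: squeeze spatial dims, then re-weight each channel."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)            # squeeze: global average pool
        self.fc = nn.Sequential(                        # excitation: bottleneck MLP
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)                     # (B, C)
        w = self.fc(w).view(b, c, 1, 1)                 # per-channel weights in (0, 1)
        return x * w                                    # rescale the feature maps


# Hypothetical usage: DenseNet-121 backbone + SE block + 8-class head
# (BreakHis multiclass has 8 tumor subtypes; binary would use 2 outputs).
backbone = models.densenet121(weights=None).features    # ends with 1024 channels
model = nn.Sequential(
    backbone,
    SEBlock(1024),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(1024, 8),
)
print(model(torch.randn(1, 3, 224, 224)).shape)          # torch.Size([1, 8])
```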