Automated diagnosis of breast cancer using multi-modal datasets: A deep convolution neural network based approach


Bibliographic Details
Published in: Biomedical Signal Processing and Control, 2022-01, Vol. 71, p. 102825, Article 102825
Main Authors: Muduli, Debendra; Dash, Ratnakar; Majhi, Banshidhar
Format: Article
Language: English
Online Access: Full text
Description
Abstract:
• An automated breast cancer detection method is proposed.
• The proposed deep CNN model requires fewer learnable parameters.
• The CNN learns discriminant features automatically from mammogram and ultrasound images.
• The model achieves greater classification accuracy on both mammogram and ultrasound images.

This paper proposes a deep convolutional neural network (CNN) model for automated breast cancer classification from two different classes of images, namely mammograms and ultrasound. The model contains only five learnable layers: four convolutional layers and one fully connected layer. This design extracts prominent features automatically from the images with a small number of tunable parameters. Exhaustive simulation results on mammogram datasets (MIAS, DDSM, and INbreast) and ultrasound datasets (BUS-1 and BUS-2) show that the proposed model outperforms recent state-of-the-art schemes. A data augmentation technique is employed to reduce overfitting and improve generalization. The proposed CNN model achieves accuracies of 96.55%, 90.68%, and 91.28% on the MIAS, DDSM, and INbreast datasets, respectively, and 100% and 89.73% on the BUS-1 and BUS-2 datasets, respectively.
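The abstract describes the architecture only at a high level: five learnable layers (four convolutional plus one fully connected), trained with data augmentation to limit overfitting. The following PyTorch sketch illustrates what such a compact network could look like. The class name, layer widths, kernel sizes, the 128x128 grayscale input resolution, pooling scheme, and augmentation pipeline are all assumptions made for illustration, not the authors' published configuration.

# Minimal sketch, assuming PyTorch; all hyperparameters below are
# illustrative guesses, not taken from the paper.
import torch
import torch.nn as nn
from torchvision import transforms

class CompactBreastCNN(nn.Module):
    """Five learnable layers: four conv layers plus one fully connected."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fifth and final learnable layer; 64 * 8 * 8 assumes a 128x128 input
        # halved four times by the pooling stages.
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)       # (N, 64, 8, 8) for 128x128 grayscale input
        x = torch.flatten(x, 1)
        return self.classifier(x)

# One plausible augmentation pipeline; the record says only that "data
# augmentation" was used, without specifying the transforms.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(15),
    transforms.ToTensor(),
])

if __name__ == "__main__":
    model = CompactBreastCNN(num_classes=2)   # e.g., benign vs. malignant
    dummy = torch.randn(4, 1, 128, 128)       # batch of grayscale image crops
    print(model(dummy).shape)                 # torch.Size([4, 2])

Keeping a single fully connected layer is what keeps the parameter count low in a design like this: the one Linear layer above holds roughly 4096 x 2 weights, far fewer than the multi-layer dense heads of classic CNNs.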
ISSN: 1746-8094
eISSN: 1746-8108
DOI: 10.1016/j.bspc.2021.102825