Multimodal prediction of breast cancer using radiogenomics and clinical trials with decision fusion
| Published in: | Journal of intelligent & fuzzy systems 2023-01, Vol.44 (2), p.2863-2880 |
|---|---|
| Main authors: | , , |
| Format: | Article |
| Language: | English |
| Online access: | Full text |
Abstract:
Multimodal analysis focuses on the internal and external manifestations of cancer cells, providing physicians, oncologists and surgeons with timely information for personalized diagnosis and treatment. Decision fusion in multimodal analysis reduces manual intervention and improves classification accuracy, enabling doctors to make quick decisions. Genetic characteristics extracted from biopsies, however, do not provide details on adjacent cells, and images can capture only the externally observable details of cancer cells. While mammograms can detect breast cancer, region-wise details can be obtained from ultrasound images; hence, several imaging modalities are used. Features are extracted from the Wisconsin Breast Cancer, clinical and gene expression datasets using the SelectKBest method, and from histology, mammogram and sonogram images using the Gray Level Co-occurrence Matrix (GLCM). For the image datasets, a Convolutional Neural Network (CNN) is used as the classifier. The combined features from the clinical, gene expression and image datasets are used to train an Integrated Stacking Classifier. Experimental findings demonstrate the integrated multimodal system's effectiveness.
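The tabular branch of the pipeline described in the abstract (SelectKBest feature selection followed by a stacking classifier that fuses base-learner decisions) can be sketched with scikit-learn. This is a minimal illustration only, using scikit-learn's bundled copy of the Wisconsin Breast Cancer dataset; the choice of k=10, the particular base learners, and the meta-learner are illustrative assumptions, not the paper's exact configuration, and the GLCM/CNN image branch is not reproduced here.

```python
# Sketch: SelectKBest feature selection + stacking-based decision fusion
# on the Wisconsin Breast Cancer dataset (scikit-learn's bundled copy).
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Keep the k most informative features by ANOVA F-score
# (k=10 is an illustrative choice, not taken from the paper).
selector = SelectKBest(f_classif, k=10)
X_train_k = selector.fit_transform(X_train, y_train)
X_test_k = selector.transform(X_test)

# Base learners whose predictions are fused by a meta-learner:
# the stacking step is one common form of decision fusion.
stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_train_k, y_train)
print(f"held-out accuracy: {stack.score(X_test_k, y_test):.3f}")
```

In the paper's full system the inputs to the stacking stage also include GLCM features from histology, mammogram and sonogram images classified by a CNN; here only the tabular path is shown.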
ISSN: 1064-1246; 1875-8967
DOI: 10.3233/JIFS-220633