A Hybrid Deep Learning Framework for Automatic Detection of Brain Tumours Using Different Modalities
Published in: IEEE Transactions on Emerging Topics in Computational Intelligence, 2024-08, pp. 1-10
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Nowadays, deep convolutional neural networks (DCNNs) are the focus of substantial research for classification and detection applications in medical image processing. However, the limited availability and unequal data distribution of publicly available datasets impede the broad use of DCNNs for medical image processing. This work proposes a novel deep learning-based framework for efficient detection of brain tumors across different openly accessible datasets of different sizes and image modalities. The introduction of a novel end-to-end Cumulative Learning Strategy (CLS) and Multi-Weighted New Loss (MWNL) function reduces the impact of unevenly distributed datasets. In the proposed framework, the DCNN model incorporates regularization, such as DropOut and DropBlock, to mitigate overfitting. Furthermore, the proposed augmentation approach, Modified RandAugment, successfully addresses the limited availability of data. Finally, the use of a K-nearest neighbor (KNN) classifier improves classification performance, since it retains the benefits of both deep learning and machine learning. Moreover, the effectiveness of the proposed framework is also validated on small and imbalanced datasets. The proposed framework outperforms others with an accuracy of up to 99.70%.
ISSN: 2471-285X
DOI: | 10.1109/TETCI.2024.3442889 |
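The record gives only the abstract, so the paper's exact CLS, MWNL, and Modified RandAugment formulations are not available here. As a minimal sketch of the general pattern the abstract describes, the snippet below illustrates two of its generic ingredients: inverse-frequency class weighting to counter an imbalanced dataset, and a KNN classifier applied to feature vectors in the final stage. Synthetic data stands in for DCNN embeddings of brain-scan images; all names and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.utils.class_weight import compute_class_weight

# Synthetic, imbalanced stand-in for learned image features (a real pipeline
# would use DCNN embeddings of MRI/CT slices): ~85% class 0, ~15% class 1.
X, y = make_classification(n_samples=600, n_features=64, n_informative=20,
                           weights=[0.85, 0.15], random_state=0)

# Inverse-frequency class weights: the standard remedy for uneven class
# distributions that a multi-weighted loss such as MWNL generalizes.
weights = compute_class_weight("balanced", classes=np.unique(y), y=y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# KNN over the feature vectors, mirroring the abstract's final-stage classifier
# that combines deep features with a classical machine-learning decision rule.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_tr, y_tr)
acc = knn.score(X_te, y_te)
```

Note that the minority class receives the larger weight (`weights[1] > weights[0]`), which is the mechanism by which a weighted loss keeps a scarce tumor class from being dominated during training.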