Hybrid deep neural network with clustering algorithms for effective gliomas segmentation

Bibliographic Details
Published in: International Journal of System Assurance Engineering and Management, 2024-03, Vol. 15 (3), p. 964-980
Main authors: Sahoo, Akshya Kumar; Parida, Priyadarsan; Muralibabu, K.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Brain tumor detection is one of the most significant areas in the field of medical imaging. Gliomas are the most common primary malignant tumors in the brain, and their accurate detection is a strenuous task due to their aggressiveness and heterogeneous structure. In this paper, a hybrid deep CNN is proposed, built from a pre-trained ResNet101 embedded with 12 new layers. The proposed method is executed in three stages. In the first stage, the hybrid deep CNN is trained on the BraTS 2020 and BraTS 2017 multi-parametric MRI (mpMRI) datasets to detect the whole tumor. In the second stage, after detection of the whole tumor, the tumor core and edema regions are segmented using the local center of mass (LCM) method. In the third stage, K-means clustering is applied to extract the tumor core and edema regions. The proposed method is evaluated on the BraTS 2020 and BraTS 2017 image datasets obtained from the Center for Biomedical Image Computing and Analytics (CBICA) portal. The accuracies of the method are 99.06%, 99.23%, and 99.5% for the whole tumor, tumor core, and edema, respectively. The average Dice scores obtained by the approach are 84.66%, 78.98%, and 72.79% for the whole tumor, edema, and tumor core, respectively.
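The third stage described in the abstract, extracting regions by K-means clustering of voxel intensities, can be illustrated with a minimal sketch. This is not the authors' implementation: the synthetic "slice", the deterministic linspace initialization, and the brightest-cluster-equals-core heuristic are all illustrative assumptions.

```python
import numpy as np

def kmeans_1d(values, k=3, iters=50):
    """Plain Lloyd's k-means on 1-D intensity values (illustrative sketch only)."""
    # Deterministic initialization: k centers spread over the intensity range
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        # Assign each voxel to its nearest cluster center
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        # Recompute each center as the mean of its assigned voxels
        new_centers = np.array([values[labels == j].mean() if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break  # converged
        centers = new_centers
    return labels, centers

# Synthetic "MRI slice": dark background, mid-intensity edema, bright core
img = np.zeros((64, 64))
img[20:40, 20:40] = 0.5   # edema-like region
img[28:34, 28:34] = 1.0   # core-like region

labels, centers = kmeans_1d(img.ravel(), k=3)
core_cluster = int(np.argmax(centers))          # brightest cluster as candidate core
core_mask = (labels == core_cluster).reshape(img.shape)
print(core_mask.sum())                           # voxel count of the brightest cluster
```

With three well-separated intensity levels, the clusters settle immediately onto the background, edema, and core intensities, and thresholding on cluster membership yields the binary region masks the abstract refers to.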
ISSN: 0975-6809, 0976-4348
DOI: 10.1007/s13198-023-02183-w