An Efficient and Robust 3D Medical Image Classification Approach Based on 3D CNN, Time‐Distributed 2D CNN‐BLSTM Models, and mRMR Feature Selection

Bibliographic Details
Published in: Computational Intelligence 2024-10, Vol. 40 (5), p. n/a
Main authors: Akbacak, Enver; Muzoğlu, Nedim
Format: Article
Language: English
Online access: Full text
Description
Abstract: The advent of 3D medical imaging has been a turning point in the diagnosis of various diseases, as voxel information from adjacent slices helps radiologists better understand complex anatomical relationships. However, the interpretation of medical images by radiologists with different levels of expertise can vary and is also time‐consuming. In recent decades, artificial intelligence‐based computer‐aided systems have provided fast and more reliable diagnostic insights with great potential for various clinical purposes. This paper proposes a deep learning‐based 3D medical image diagnosis method. The method classifies MedMNIST3D, which consists of six 3D biomedical datasets obtained from CT, MRA, and electron microscopy modalities. The proposed method concatenates 3D image features extracted from three independent networks: a 3D CNN and two time‐distributed ResNet‐BLSTM structures. The most discriminative features are then selected via the minimum redundancy maximum relevance (mRMR) feature selection method and classified by a neural network model. Experiments adhere to the official splits and evaluation metrics of the MedMNIST3D datasets. The results show that the proposed approach outperforms similar studies in terms of accuracy and AUC.
ISSN: 0824-7935
eISSN: 1467-8640
DOI: 10.1111/coin.70000
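
The abstract describes the pipeline only at a high level. As a rough illustration of the three-branch design, the Keras sketch below wires up a 3D CNN branch, two time‐distributed 2D CNN‐BLSTM branches, feature concatenation, a greedy mRMR feature selector, and a small dense classifier. It is a minimal sketch under stated assumptions, not the authors' implementation: the exact ResNet variant, training procedure, selected feature count, and classifier architecture are not given in the abstract, and every size below is hypothetical.

```python
# Minimal sketch of the pipeline outlined in the abstract. Everything
# below is an assumption for illustration: MedMNIST3D volumes are taken
# as 28x28x28 single-channel arrays, a small 2D CNN stands in for the
# paper's ResNet slice backbone, the two BLSTM branches are simply
# independent instances (the abstract does not say how they differ),
# and the branches are left untrained here, whereas the paper
# presumably trains them before feature extraction.
import numpy as np
from tensorflow.keras import layers, models
from sklearn.feature_selection import mutual_info_classif


def cnn3d_branch(input_shape=(28, 28, 28, 1)):
    """3D CNN feature extractor (hypothetical filter counts)."""
    inp = layers.Input(shape=input_shape)
    x = layers.Conv3D(32, 3, activation="relu")(inp)
    x = layers.MaxPooling3D(2)(x)
    x = layers.Conv3D(64, 3, activation="relu")(x)
    x = layers.GlobalAveragePooling3D()(x)
    return models.Model(inp, x)


def td_cnn_blstm_branch(input_shape=(28, 28, 28, 1)):
    """Treats the depth axis as a slice sequence: a per-slice 2D CNN
    (stand-in for the ResNet backbone) runs under TimeDistributed,
    then a bidirectional LSTM aggregates across slices."""
    inp = layers.Input(shape=input_shape)
    slice_cnn = models.Sequential([
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Conv2D(64, 3, activation="relu"),
        layers.GlobalAveragePooling2D(),
    ])
    x = layers.TimeDistributed(slice_cnn)(inp)    # (batch, 28, 64)
    x = layers.Bidirectional(layers.LSTM(64))(x)  # (batch, 128)
    return models.Model(inp, x)


def build_feature_extractor():
    """Concatenates the features of the three independent branches."""
    inp = layers.Input(shape=(28, 28, 28, 1))
    feats = layers.Concatenate()([
        cnn3d_branch()(inp),
        td_cnn_blstm_branch()(inp),
        td_cnn_blstm_branch()(inp),
    ])
    return models.Model(inp, feats)


def mrmr_select(X, y, k):
    """Greedy mRMR, difference (MID) criterion: repeatedly picks the
    feature maximizing relevance (mutual information with y) minus the
    mean absolute correlation with the features already selected."""
    relevance = mutual_info_classif(X, y, random_state=0)
    selected = [int(np.argmax(relevance))]
    candidates = set(range(X.shape[1])) - set(selected)
    while candidates and len(selected) < k:
        def score(j):
            redundancy = np.mean([
                # nan_to_num guards against constant (dead) features
                abs(np.nan_to_num(np.corrcoef(X[:, j], X[:, s])[0, 1]))
                for s in selected
            ])
            return relevance[j] - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected


# Toy run with random data standing in for a MedMNIST3D split; k and
# all classifier widths are hypothetical.
num_classes = 11
volumes = np.random.rand(64, 28, 28, 28, 1).astype("float32")
labels = np.random.randint(0, num_classes, size=64)

extractor = build_feature_extractor()
deep_feats = extractor.predict(volumes)   # (64, 64 + 128 + 128) = (64, 320)
idx = mrmr_select(deep_feats, labels, k=100)

clf = models.Sequential([
    layers.Input(shape=(len(idx),)),
    layers.Dense(64, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])
clf.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
clf.fit(deep_feats[:, idx], labels, epochs=5)
```

The difference-criterion mRMR used here is one common variant; the paper may use a different relevance/redundancy formulation, and its relevance and redundancy measures are not specified in the abstract.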