A Novel Colorectal Histopathological Image Classification Method Based on Progressive Multi-Granularity Feature Fusion of Patch


Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 68981-68998
Main Authors: Cao, Zhengguang; Jia, Wei; Jiang, Haifeng; Zhao, Xuefen; Gao, Hongjuan; Si, Jialong; Shi, Chunhui
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: Colorectal cancer (CRC) is a significant global health concern, ranking as the second most common cancer worldwide. Accurate classification of CRC is crucial for clinical practice and research. Deep learning-based methods have gained popularity in computer-aided CRC classification tasks. However, existing methods often overlook discriminative features at different local granularities, leading to suboptimal classification results. In this paper, we propose a novel Colorectal Histopathological Image Classification Method Based on Progressive Multi-granularity Feature Fusion of Patch (PMFF). Our method combines global features of CRC images with features at different local granularities, enhancing the classification process. PMFF employs a progressive learning strategy that guides the model's attention toward information at different local patch granularities at successive stages, culminating in feature fusion at the final stage. The method comprises an inter-patch information communication mechanism, a feature enhancement strategy, and a feature extraction network supporting the progressive learning strategy. We conducted evaluations on three public datasets, and the experimental results demonstrate that our method outperforms existing CRC classification methods, achieving classification accuracies of 96.6% and 92.3%, precisions of 96.5% and 92.4%, recalls of 96.3% and 92.3%, and F1-scores of 96.4% and 92.3%, respectively.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3401240
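
The abstract describes stage-wise attention over multiple patch granularities with feature fusion at the final stage. The following is a minimal, hypothetical PyTorch sketch of that general idea only; it is not the authors' PMFF implementation, and all class names, layer sizes, granularity settings, and the per-stage supervision scheme are assumptions made for illustration.

```python
# Hypothetical sketch (not the authors' code): multi-granularity patch
# feature extraction with fusion at the final stage, in plain PyTorch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GranularityBranch(nn.Module):
    """Encodes an image split into n x n patches and pools patch features."""

    def __init__(self, n_patches, feat_dim=128):
        super().__init__()
        self.n = n_patches
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        b, c, h, w = x.shape
        ph, pw = h // self.n, w // self.n
        # Split into n x n patches, encode each patch, then average patch features.
        patches = x.unfold(2, ph, ph).unfold(3, pw, pw)           # B, C, n, n, ph, pw
        patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(-1, c, ph, pw)
        feats = self.encoder(patches).view(b, self.n * self.n, -1)
        return feats.mean(dim=1)                                   # B, feat_dim


class ProgressiveFusionClassifier(nn.Module):
    """One branch per granularity (1x1 acts as the global view); fused at the end."""

    def __init__(self, granularities=(1, 2, 4), num_classes=9, feat_dim=128):
        super().__init__()
        self.branches = nn.ModuleList(GranularityBranch(n, feat_dim) for n in granularities)
        self.stage_heads = nn.ModuleList(nn.Linear(feat_dim, num_classes) for _ in granularities)
        self.fusion_head = nn.Linear(feat_dim * len(granularities), num_classes)

    def forward(self, x):
        feats = [branch(x) for branch in self.branches]
        stage_logits = [head(f) for head, f in zip(self.stage_heads, feats)]  # per-stage outputs
        fused_logits = self.fusion_head(torch.cat(feats, dim=1))              # final-stage fusion
        return stage_logits, fused_logits


if __name__ == "__main__":
    model = ProgressiveFusionClassifier()
    stage_logits, fused = model(torch.randn(2, 3, 224, 224))
    labels = torch.tensor([0, 1])
    loss = sum(F.cross_entropy(l, labels) for l in stage_logits + [fused])
    print(fused.shape, loss.item())
```

In the paper's progressive learning strategy, the stages are reportedly trained in sequence so the model attends to one patch granularity at a time before fusion; the sketch above only shows the per-granularity branches and the final fusion head, with a simple summed loss standing in for that staged schedule.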