Empowering Ovarian Cancer Subtype Classification with Parallel Swin Transformers and WSI Imaging


Detailed Description

Bibliographic details
Published in: International Arab Journal of Information Technology, 2024-11, Vol. 21 (6)
Authors: ALkahla, Lubna; Saeed, Jwan; Hussein, Maher
Format: Article
Language: English
Online access: Full text
Description
Abstract: Ovarian cancer constitutes a notable proportion of cancer-related mortalities among women. The diagnostic classification of ovarian cancer subtypes has proven complex, with limited concordance among pathologists. Vision Transformer (ViT) models have emerged as the predominant architecture in numerous computer vision applications, encompassing tasks such as image classification and cancer detection. Their success stems primarily from their capacity to integrate global contextual information through self-attention mechanisms during learning. However, the key limitation of ViT is its poor scalability to high-resolution images: computation grows quadratically with image size, producing a large number of tokens and significant computational demands for self-attention. The Swin Transformer (Swin-T) addresses this challenge by introducing two main concepts: hierarchical feature mapping and windowed attention. This work presents a parallel implementation of Swin Transformers (Swin-Ts) that leverages their powerful feature extraction capabilities to classify five ovarian cancer subtypes from Whole Slide Imaging (WSI), yielding average precision, recall, and F1-score of 0.958, 0.964, and 0.96, respectively. The findings show that the proposed parallel Swin-Ts reduce misclassification errors and improve the robustness of medical image analysis. The suggested technique is promising for accurate and efficient ovarian carcinoma subtype categorization, with possible applicability to other cancers. Future research will integrate other data sources and validate the technique in various clinical contexts.
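The quadratic-versus-windowed cost argument in the abstract can be made concrete with a minimal back-of-the-envelope sketch. This is not code from the paper; the function below simply counts token-pair interactions for global self-attention versus Swin-style non-overlapping windowed attention, using the common 224×224 input, 4×4 patch, and 7×7 window configuration as an assumed example.

```python
def attention_token_ops(num_tokens: int, window: int = None) -> int:
    """Approximate pairwise-interaction count for one self-attention layer.

    Global attention (ViT): every token attends to every token -> N^2.
    Windowed attention (Swin-T): tokens attend only within non-overlapping
    M x M windows -> (N / M^2) windows, each costing (M^2)^2, i.e. N * M^2,
    which grows linearly rather than quadratically in N.
    """
    if window is None:
        return num_tokens ** 2
    tokens_per_window = window * window
    # Assumes the token grid tiles exactly into windows, as in Swin-T.
    assert num_tokens % tokens_per_window == 0
    num_windows = num_tokens // tokens_per_window
    return num_windows * tokens_per_window ** 2

# A 224x224 image with 4x4 patches gives a 56x56 = 3136-token grid.
n = 56 * 56
global_cost = attention_token_ops(n)              # quadratic in n
windowed_cost = attention_token_ops(n, window=7)  # linear in n
print(global_cost // windowed_cost)  # windowing saves a constant factor of n / M^2
```

For this configuration the windowed variant is cheaper by a factor of n / 49 = 64 per layer, which is the scalability gain that makes Swin-T practical on the large token grids produced by WSI tiles.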
ISSN: 1683-3198
DOI: 10.34028/iajit/21/6/5