MANS-Net: Multiple Attention-Based Nuclei Segmentation in Multi Organ Digital Cancer Histopathology Images


Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 173530-173539
Main authors: Ahmad, Ibtihaj; Ul Islam, Zain; Riaz, Saleem; Xue, Fuzhong
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: The segmentation of nuclei is critical in histopathology investigations, yet segmenting nuclei images remains difficult under variable clinical conditions. Several deep learning methods have recently been proposed; however, these approaches rarely address such clinical challenges. Since most of the information in histopathology images resides in the color channels, with the remainder in spatial patterns, distributions, and similar cues, these problems can be handled by considering all the associated features simultaneously. This work proposes a novel multiple attention-based model, MANS-Net, which utilizes channel, spatial, and transformer-based attention modules to address the problems mentioned above. MANS-Net efficiently learns color-based, spatial, coarse, and granular features, improving nuclei segmentation. We report that MANS-Net significantly outperforms state-of-the-art segmentation algorithms, achieving an F1 score of 0.8457 on the PanNuke dataset and 0.8137 on the Kumar dataset. We also show that the proposed model is more lightweight than state-of-the-art approaches. The proposed model should benefit future work that depends on semantic segmentation, such as nuclei instance segmentation and nuclei categorization.
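The abstract does not specify the internals of the attention modules, but the general idea of gating features per channel and per pixel can be sketched as follows. This is a minimal, framework-free illustration of squeeze-excite-style channel attention followed by spatial attention; the function names, pooling choices, and tensor shapes are assumptions for illustration, not the authors' actual MANS-Net design.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x):
    """x: (C, H, W). Squeeze spatial dims to per-channel statistics,
    then gate each channel with a sigmoid weight (squeeze-excite style).
    Illustrative only; the paper's module may differ."""
    avg = x.mean(axis=(1, 2))          # (C,) global average pool
    mx = x.max(axis=(1, 2))            # (C,) global max pool
    gate = sigmoid(avg + mx)           # (C,) per-channel gate in (0, 1)
    return x * gate[:, None, None]

def spatial_attention(x):
    """x: (C, H, W). Pool across channels to a single map,
    then gate every pixel location."""
    avg = x.mean(axis=0)               # (H, W)
    mx = x.max(axis=0)                 # (H, W)
    gate = sigmoid(avg + mx)           # (H, W) per-pixel gate in (0, 1)
    return x * gate[None, :, :]

# Apply both gates to a random feature map; shape is preserved.
feat = np.random.randn(8, 16, 16)
out = spatial_attention(channel_attention(feat))
print(out.shape)
```

In a real network the gates would be produced by small learned layers (e.g. a bottleneck MLP for the channel gate and a convolution for the spatial map) rather than by raw pooled statistics, and a transformer-based attention branch would operate on flattened patch tokens.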
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3502766