MitosisNet: End-to-End Mitotic Cell Detection by Multi-Task Learning
Saved in:
Published in: | IEEE Access 2020, Vol. 8, p. 68695-68710 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Mitotic cell detection is one of the challenging problems in the field of computational pathology. Currently, mitotic cell detection and counting are among the strongest prognostic markers for breast cancer diagnosis. Clinical visual inspection of histology slides is tedious, error prone, and time consuming for the pathologist. Thus, automatic mitotic cell detection approaches are in high demand in clinical practice. In this paper, we propose an end-to-end multi-task learning system for mitosis detection from pathological images, named "MitosisNet". MitosisNet consists of segmentation, detection, and classification models, where the segmentation and detection models are used for mitosis reference region detection and the classification model is applied for further confirmation of the mitosis regions. In addition, an integrated multi-patch reference scheme and a novel confidence analysis strategy are introduced to improve overall detection performance during testing. The proposed system is evaluated on three different publicly available datasets: MITOSIS 2012, MITOSIS 2014, and the Case Western Reserve University (CWRU) dataset. The experimental results demonstrate state-of-the-art performance compared to existing methods, and the proposed approach is fast enough to meet the requirements of clinical practice. |
---|---|
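The abstract's pipeline (segmentation/detection propose candidate regions, a classifier confirms them, and multi-patch confidences are aggregated) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the stand-in "models" are toy functions, and the patch offsets and threshold are invented for illustration.

```python
# Hypothetical sketch of a MitosisNet-style pipeline: candidate proposal,
# per-patch confirmation, and multi-patch confidence averaging.
# All names, thresholds, and stand-in models are illustrative only.

def propose_regions(image):
    # Stand-in for the segmentation + detection models: return candidate
    # (x, y, score) tuples. Here, any pixel above 0.5 becomes a candidate.
    return [(x, y, v) for (x, y), v in image.items() if v > 0.5]

def classify_patch(image, x, y):
    # Stand-in for the classification model confirming a candidate region:
    # returns a mitosis probability for the patch centred at (x, y).
    return image.get((x, y), 0.0)

def detect_mitosis(image, offsets=((0, 0), (1, 0), (0, 1)), threshold=0.6):
    # Multi-patch reference: score each candidate from several shifted
    # patches and keep it only if the mean confidence passes the threshold.
    detections = []
    for x, y, _ in propose_regions(image):
        scores = [classify_patch(image, x + dx, y + dy) for dx, dy in offsets]
        confidence = sum(scores) / len(scores)
        if confidence >= threshold:
            detections.append(((x, y), round(confidence, 3)))
    return detections

# Toy "image" as sparse pixel probabilities: one strong cluster, one isolated
# weak candidate that the multi-patch averaging filters out.
image = {(2, 2): 0.9, (3, 2): 0.8, (2, 3): 0.7, (7, 7): 0.55}
print(detect_mitosis(image))  # → [((2, 2), 0.8)]
```

Averaging confidences over several shifted patches, as sketched here, makes the final decision less sensitive to a single mislocalized crop, which is the intuition behind the multi-patch scheme described in the abstract.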
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2020.2983995 |