Semi‐supervised breast histopathological image classification with self‐training based on non‐linear distance metric



Bibliographic details
Published in: IET Image Processing, 2022-10, Vol. 16 (12), pp. 3164-3176
Main authors: Liu, Kun; Liu, Zhuolin; Liu, Sidong
Format: Article
Language: English
Online access: Full text
Description
Abstract: Histopathological analysis demands substantial clinical experience and time from pathologists. Artificial intelligence (AI) may play an important role in assisting pathologists and enabling more efficient and effective histopathological diagnoses. To address the challenge of requiring a large number of labelled images to train deep learning models for breast cancer histopathological image classification, a self-training semi-supervised learning method consisting of three components is proposed. First, a pre-trained ResNet-18 was used to extract features and generate pseudo-labels for unlabelled data; second, a relational weight network based on the squeeze-and-excitation network (SENet) was trained to compute non-linear distance metrics between labelled and unlabelled samples, improving the accuracy of pseudo-labelling; last, a consistency loss, the maximum mean discrepancy (MMD), was added to the model to minimise the divergence between the distributions of unlabelled and labelled samples. Extensive experiments were conducted on the open-access BreakHis dataset. The proposed method outperformed state-of-the-art semi-supervised methods at all tested annotation percentages (10-70%), and achieved performance comparable to supervised methods at the higher percentages (50%, 70%).
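The MMD consistency loss mentioned in the abstract measures the distance between two distributions via kernel mean embeddings. A minimal NumPy sketch is shown below; the RBF kernel, the bandwidth `sigma`, and the toy feature batches are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of x and rows of y.
    sq_dists = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def mmd2(x, y, sigma=1.0):
    # Biased estimate of the squared maximum mean discrepancy:
    # MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)].
    kxx = rbf_kernel(x, x, sigma).mean()
    kyy = rbf_kernel(y, y, sigma).mean()
    kxy = rbf_kernel(x, y, sigma).mean()
    return kxx + kyy - 2.0 * kxy

# Toy 2-D "feature" batches standing in for labelled/unlabelled embeddings.
rng = np.random.default_rng(0)
labelled = rng.normal(0.0, 1.0, size=(100, 2))
unlabelled_near = rng.normal(0.0, 1.0, size=(100, 2))   # same distribution
unlabelled_far = rng.normal(3.0, 1.0, size=(100, 2))    # shifted distribution

# MMD^2 stays small when distributions match and grows when they diverge,
# which is why minimising it pulls unlabelled features toward labelled ones.
print(mmd2(labelled, unlabelled_near, sigma=1.0))
print(mmd2(labelled, unlabelled_far, sigma=1.0))
```

In the training loop described by the abstract, this quantity would be computed on network features for each mini-batch and added to the classification loss as a regulariser, so that gradient descent shrinks the gap between the labelled and unlabelled feature distributions.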
ISSN:1751-9659
1751-9667
DOI:10.1049/ipr2.12548