Towards Discriminative Representation with Meta-learning for Colonoscopic Polyp Re-Identification
Main authors: | , , , , , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | Colonoscopic Polyp Re-Identification aims to match the same polyp from a large gallery with images taken from different views using different cameras, and plays an important role in the prevention and treatment of colorectal cancer in computer-aided diagnosis. However, traditional methods for object ReID that directly adopt CNN models trained on the ImageNet dataset usually produce unsatisfactory retrieval performance on colonoscopic datasets due to the large domain gap. Additionally, these methods neglect to explore the potential of self-discrepancy among intra-class relations in the colonoscopic polyp dataset, which remains an open research problem in the medical community. To solve this dilemma, we propose a simple but effective training method named Colo-ReID, which helps the model learn more general and discriminative knowledge based on a meta-learning strategy in scenarios with fewer samples. On top of this, a dynamic Meta-Learning Regulation mechanism called MLR is introduced to further boost the performance of polyp re-identification. To the best of our knowledge, this is the first attempt to leverage the meta-learning paradigm instead of traditional machine learning algorithms to effectively train deep models for the task of colonoscopic polyp re-identification. Empirical results show that our method outperforms current state-of-the-art methods by a clear margin. |
DOI: | 10.48550/arxiv.2308.00929 |
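
The abstract names the approach but gives no implementation details, so the internals of Colo-ReID and MLR are not available here. As a way to make the idea concrete, below is a minimal sketch of one common meta-learning recipe (a Reptile-style inner/outer loop over small re-identification episodes) that matches the abstract's description of learning discriminative embeddings from scenarios with fewer samples. The network, the triplet objective, the episode format, and all hyperparameters are hypothetical illustrations, not the authors' implementation.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingNet(nn.Module):
    """Hypothetical small CNN mapping polyp crops to L2-normalized
    embeddings; the paper's actual backbone is not given in the abstract."""
    def __init__(self, dim=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, dim)

    def forward(self, x):
        return F.normalize(self.head(self.features(x)), dim=1)

def episode_loss(model, anchors, positives, negatives, margin=0.3):
    # Triplet objective: views of the same polyp should embed closer
    # together than views of different polyps, by at least `margin`.
    return F.triplet_margin_loss(
        model(anchors), model(positives), model(negatives), margin=margin
    )

def meta_step(model, episode, inner_steps=5, inner_lr=1e-3, meta_lr=0.1):
    """Reptile-style update: adapt a clone of the model on one small
    episode (task), then move the meta-parameters toward the adapted
    weights. Repeated over many episodes, this yields an initialization
    that adapts quickly to new polyps from few samples."""
    fast = copy.deepcopy(model)
    inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        inner_opt.zero_grad()
        loss = episode_loss(fast, *episode)
        loss.backward()
        inner_opt.step()
    with torch.no_grad():  # outer (meta) update on the original weights
        for w, w_fast in zip(model.parameters(), fast.parameters()):
            w += meta_lr * (w_fast - w)
    return loss.item()

if __name__ == "__main__":
    model = EmbeddingNet()
    # Dummy episode: batches of (anchor, positive, negative) image crops.
    episode = tuple(torch.randn(8, 3, 128, 128) for _ in range(3))
    for step in range(3):
        print(f"episode {step}: inner loss {meta_step(model, episode):.4f}")
```

A dynamic regulation mechanism like MLR would presumably modulate this loop (for example, by adjusting the meta-update on the fly), but the abstract does not specify how, so none of that is reflected in the sketch above.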