Center Contrastive Loss for Metric Learning
Format: Article
Language: English
Abstract: Contrastive learning is a widely studied topic in metric learning. However, sampling effective contrastive pairs remains a challenge due to factors such as limited batch size, imbalanced data distribution, and the risk of overfitting. In this paper, we propose a novel metric learning function called Center Contrastive Loss, which maintains a class-wise center bank and compares the category centers with the query data points using a contrastive loss. The center bank is updated in real time to boost model convergence without the need for well-designed sample mining. The category centers are well-optimized classification proxies that re-balance the supervisory signal of each class. Furthermore, the proposed loss combines the advantages of both contrastive and classification methods by reducing intra-class variation and enhancing inter-class differences to improve the discriminative power of the embeddings. Our experimental results, as shown in Figure 1, demonstrate that a standard network (ResNet50) trained with our loss achieves state-of-the-art performance and faster convergence.
DOI: 10.48550/arxiv.2308.00458
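
The abstract only describes the mechanism in prose. The following PyTorch sketch is a hypothetical illustration of that mechanism, not the authors' implementation: the class name `CenterContrastiveLoss`, the EMA momentum update, and the temperature value are all assumptions layered on the abstract's description of a real-time class-wise center bank contrasted against query embeddings.

```python
import torch
import torch.nn.functional as F


class CenterContrastiveLoss(torch.nn.Module):
    """Illustrative sketch (not the paper's code): an EMA bank of
    per-class centers contrasted against query embeddings."""

    def __init__(self, num_classes: int, dim: int,
                 momentum: float = 0.9, temperature: float = 0.1):
        super().__init__()
        self.momentum = momentum        # assumed EMA factor for center updates
        self.temperature = temperature  # assumed softmax temperature
        # Center bank: one L2-normalized center per class, updated in real time.
        self.register_buffer(
            "centers", F.normalize(torch.randn(num_classes, dim), dim=1)
        )

    @torch.no_grad()
    def _update_centers(self, embeddings: torch.Tensor, labels: torch.Tensor):
        # Move each present class's center toward the batch mean of its samples.
        for c in labels.unique():
            batch_mean = embeddings[labels == c].mean(dim=0)
            self.centers[c] = F.normalize(
                self.momentum * self.centers[c]
                + (1.0 - self.momentum) * batch_mean,
                dim=0,
            )

    def forward(self, embeddings: torch.Tensor, labels: torch.Tensor):
        embeddings = F.normalize(embeddings, dim=1)
        self._update_centers(embeddings, labels)
        # Cosine similarities to every center act as logits; cross-entropy
        # against the true label is an InfoNCE-style contrastive objective
        # whose single positive is the query's own class center.
        logits = embeddings @ self.centers.t() / self.temperature
        return F.cross_entropy(logits, labels)
```

Under these assumptions, the cross-entropy over center similarities plays the contrastive role the abstract describes: each query is pulled toward its own class center (reducing intra-class variation) and pushed away from every other center (enhancing inter-class differences), with no pair mining required since the centers stand in for sampled positives and negatives.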