Rethinking Positive Pairs in Contrastive Learning
Format: Article
Language: English
Abstract: Contrastive learning, a prominent approach to representation learning, traditionally assumes that positive pairs are closely related samples (the same image or class) and that negative pairs are distinct samples. We challenge this assumption by proposing to learn from arbitrary pairs, allowing any pair of samples to be positive within our framework. The primary challenge of the proposed approach lies in applying contrastive learning to disparate pairs that are semantically distant. Motivated by the discovery that SimCLR can separate a given arbitrary pair (e.g., garter snake and table lamp) within a subspace, we propose a feature filter, conditioned on class pairs, that creates the requisite subspaces via gate vectors that selectively activate or deactivate dimensions. This filter can be optimized through gradient descent within a conventional contrastive learning mechanism. We present Hydra, a universal contrastive learning framework for visual representations that extends conventional contrastive learning to accommodate arbitrary pairs. Our approach is validated on IN1K, where the 1K diverse classes compose 500,500 pairs, most of them distinct. Surprisingly, Hydra achieves superior performance in this challenging setting. Additional benefits include the prevention of dimensional collapse and the discovery of class relationships. Our work highlights the value of learning common features of arbitrary pairs and potentially broadens the applicability of contrastive learning techniques to sample pairs with weak relationships.
DOI: 10.48550/arxiv.2410.18200
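Note: the 500,500 pairs cited in the abstract correspond to all unordered pairs of ImageNet-1K's 1,000 classes, including same-class pairs: 1000 × 999 / 2 + 1000 = 500,500.

The abstract describes the core mechanism only at a high level: a filter, conditioned on a pair of classes, emits a gate vector that switches embedding dimensions on or off, and the filter is trained by ordinary gradient descent inside a standard contrastive objective. The sketch below illustrates one way such a gated contrastive step could look, assuming a SimCLR-style NT-Xent loss and a sigmoid gate produced by a small MLP; all module names (PairConditionedGate, gated_contrastive_loss) and design details are illustrative assumptions, not the Hydra implementation described in the paper.

```python
# Minimal sketch (not the paper's released code): a class-pair-conditioned
# feature filter that gates embedding dimensions before a standard
# NT-Xent-style contrastive loss. All names here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PairConditionedGate(nn.Module):
    """Maps a class-pair condition to a gate vector in (0, 1)^D that
    selects the subspace in which the two samples should agree."""

    def __init__(self, num_classes: int, dim: int):
        super().__init__()
        self.class_emb = nn.Embedding(num_classes, dim)
        self.gate_mlp = nn.Sequential(
            nn.Linear(2 * dim, dim),
            nn.ReLU(inplace=True),
            nn.Linear(dim, dim),
        )

    def forward(self, class_a: torch.Tensor, class_b: torch.Tensor) -> torch.Tensor:
        cond = torch.cat([self.class_emb(class_a), self.class_emb(class_b)], dim=-1)
        # A sigmoid keeps the gate differentiable, so it can be optimized by
        # gradient descent together with the rest of the contrastive pipeline.
        return torch.sigmoid(self.gate_mlp(cond))


def gated_contrastive_loss(z_a, z_b, gate, temperature: float = 0.1):
    """NT-Xent-style loss computed on gated (subspace-selected) features."""
    z_a = F.normalize(z_a * gate, dim=-1)
    z_b = F.normalize(z_b * gate, dim=-1)
    logits = z_a @ z_b.t() / temperature          # (N, N) similarity matrix
    targets = torch.arange(z_a.size(0), device=z_a.device)
    # Row i of z_a is treated as the positive of row i of z_b.
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    N, D, num_classes = 8, 128, 1000
    gate_net = PairConditionedGate(num_classes, D)
    z_a, z_b = torch.randn(N, D), torch.randn(N, D)   # stand-ins for encoder outputs
    ya = torch.randint(0, num_classes, (N,))
    yb = torch.randint(0, num_classes, (N,))
    loss = gated_contrastive_loss(z_a, z_b, gate_net(ya, yb))
    loss.backward()   # the gate network receives gradients from the contrastive loss
    print(float(loss))
```

In this toy setup only the gate network receives gradients; in a full pipeline the encoder producing z_a and z_b would be trained jointly, and the conditioning signal need not be ground-truth class labels.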