Aggregative Self-Supervised Feature Learning from a Limited Sample
Format: Article
Language: English
Abstract: Self-supervised learning (SSL) is an efficient approach that addresses the issue of limited training data and annotation shortage. The key component of SSL is its proxy task, which defines the supervisory signal and drives the learning toward effective feature representations. However, most SSL approaches focus on a single proxy task, which greatly limits the expressive power of the learned features and therefore deteriorates the network's generalization capacity. In this regard, we propose two aggregation strategies that exploit complementarity in various forms to boost the robustness of self-supervised features. We first propose a principled framework of multi-task aggregative self-supervised learning from a limited sample to form a unified representation, with the intent of exploiting feature complementarity across different tasks. Then, in self-aggregative SSL, we propose to self-complement an existing proxy task with an auxiliary loss function based on a linear centered kernel alignment metric, which explicitly promotes the exploration of regions not covered by the features learned from the proxy task at hand, further boosting the modeling capability. Our extensive experiments on 2D natural image and 3D medical image classification tasks under limited data and annotation scenarios confirm that the proposed aggregation strategies successfully boost classification accuracy.
DOI: 10.48550/arxiv.2012.07477
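The abstract's self-aggregative strategy builds its auxiliary loss on linear centered kernel alignment (CKA), a standard similarity measure between two sets of features. As a minimal NumPy sketch of the metric itself (the paper's exact auxiliary-loss wiring is not specified here, so only the similarity computation is shown; the function name and shapes are illustrative assumptions):

```python
import numpy as np

def linear_cka(X, Y):
    """Linear centered kernel alignment between two feature matrices.

    X: (n, d1) and Y: (n, d2) -- the same n samples embedded by two
    feature extractors. Returns a similarity in [0, 1], where 1 means
    the representations are identical up to rotation and isotropic scale.
    """
    # Center each feature dimension across the sample axis.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # Linear CKA: ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    cross = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, ord="fro")
    norm_y = np.linalg.norm(Y.T @ Y, ord="fro")
    return cross / (norm_x * norm_y)
```

In a self-complementing setup of the kind the abstract describes, one would presumably add a penalty proportional to `linear_cka(features_proxy, features_auxiliary)` to the training objective, so that the auxiliary branch is pushed toward representations the main proxy task has not already captured.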