Inconsistent Few-Shot Relation Classification via Cross-Attentional Prototype Networks with Contrastive Learning
Main Authors: | , , , , |
Format: | Article |
Language: | eng |
Online Access: | Order full text |
Abstract: | Standard few-shot relation classification (RC) aims to learn a
robust classifier with only a few labeled instances per class. However,
previous works rarely investigate the effects of using a different number of
classes (i.e., $N$-way) and a different number of labeled instances per class
(i.e., $K$-shot) during training vs. testing. In this work, we define a new
task, \textit{inconsistent few-shot RC}, where the model needs to handle the
inconsistency of $N$ and $K$ between training and testing. To address this new
task, we propose Prototype Network-based cross-attention contrastive learning
(ProtoCACL) to capture the rich mutual interactions between the support set
and the query set. Experimental results demonstrate that our ProtoCACL
outperforms the state-of-the-art baseline model under both inconsistent-$K$
and inconsistent-$N$ settings, owing to its more robust and discriminative
representations. Moreover, we identify that in the inconsistent few-shot
setting, models can achieve better performance with \textit{less data} than in
the standard few-shot setting, given carefully selected $N$ and $K$. At the
end of the paper, we provide further analyses and suggestions to
systematically guide the selection of $N$ and $K$ under different scenarios. |
DOI: | 10.48550/arxiv.2110.08254 |
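
The abstract names the key ingredients of the method (an $N$-way $K$-shot episodic setup, class prototypes, support-query cross-attention, and a contrastive objective) without showing them. Below is a minimal sketch in PyTorch of how such a pipeline could look. It is not the authors' ProtoCACL implementation: the function names (`prototypes_with_cross_attention`, `episode_loss`), the scaled dot-product attention variant, and the supervised contrastive term are all assumptions for illustration. What the sketch demonstrates is that nothing in the episode depends on a fixed $N$ or $K$, which is what makes the inconsistent few-shot setting well-posed.

```python
# Illustrative sketch only, NOT the paper's released code. Assumes instances
# are already encoded into D-dimensional embeddings by some sentence encoder.
import torch
import torch.nn.functional as F

def prototypes_with_cross_attention(support, query):
    """support: (N, K, D) support embeddings; query: (Q, D) query embeddings.

    A generic stand-in for support-query interaction: each query attends over
    all N*K support instances with scaled dot-product attention, and the
    attention-weighted support vectors are averaged within each class,
    yielding one query-conditioned prototype per class.
    """
    N, K, D = support.shape
    flat = support.reshape(N * K, D)                       # (N*K, D)
    attn = F.softmax(query @ flat.t() / D ** 0.5, dim=-1)  # (Q, N*K)
    weighted = attn.unsqueeze(-1) * flat                   # (Q, N*K, D)
    return weighted.reshape(-1, N, K, D).mean(dim=2)       # (Q, N, D)

def episode_loss(support, query, query_labels, tau=0.1):
    """Nearest-prototype cross-entropy plus a simple supervised contrastive
    term pulling same-class query embeddings together (assumed variant)."""
    protos = prototypes_with_cross_attention(support, query)  # (Q, N, D)
    dists = ((query.unsqueeze(1) - protos) ** 2).sum(dim=-1)  # (Q, N)
    ce = F.cross_entropy(-dists, query_labels)                # softmax over -distance

    z = F.normalize(query, dim=-1)
    sim = z @ z.t() / tau                                     # (Q, Q) cosine / tau
    eye = torch.eye(len(z), dtype=torch.bool)
    pos = (query_labels.unsqueeze(0) == query_labels.unsqueeze(1)) & ~eye
    logprob = sim - torch.logsumexp(
        sim.masked_fill(eye, float("-inf")), dim=1, keepdim=True)
    contrastive = -logprob[pos].mean()  # needs >= 2 queries per class
    return ce + contrastive

# The episode is agnostic to N and K, so training and testing can disagree:
# e.g. train on 10-way 5-shot episodes, evaluate on 5-way 1-shot episodes.
support = torch.randn(10, 5, 64)             # training episode: N=10, K=5, D=64
query = torch.randn(30, 64)                  # Q=30 query instances
labels = torch.randint(0, 10, (30,))         # class index of each query
loss = episode_loss(support, query, labels)
```

At test time one would simply sample episodes with a different $(N, K)$, e.g. a support tensor of shape `(5, 1, 64)` for 5-way 1-shot; the same functions apply unchanged, which is exactly the flexibility the inconsistent few-shot task exercises.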