Fast Few-shot Debugging for NLU Test Suites
Format: Article
Language: English
Online access: Order full text
Abstract: We study few-shot debugging of transformer-based natural language understanding models, using recently popularized test suites to not just diagnose but correct a problem. Given a few debugging examples of a certain phenomenon, and a held-out test set of the same phenomenon, we aim to maximize accuracy on the phenomenon at a minimal cost in accuracy on the original test set. We examine several methods that are faster than full-epoch retraining. We introduce a new fast method, which samples a few in-danger examples from the original training set. Compared to fast methods using parameter-distance constraints or Kullback-Leibler divergence, we achieve superior original accuracy for comparable debugging accuracy.
DOI: 10.48550/arxiv.2204.06555
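
The abstract does not spell out how in-danger examples are chosen, but the broad recipe it describes (briefly fine-tune on the debugging examples plus a few guard examples drawn from the original training set, rather than retraining for a full epoch) can be sketched. Below is a minimal PyTorch illustration; the confidence-based selection heuristic, the mixed fine-tuning loop, and all names are assumptions made for illustration, not the paper's actual algorithm.

```python
# Hypothetical sketch of few-shot debugging with "in-danger" sampling.
# Assumption (not from the paper): in-danger examples are approximated as the
# original training examples on which the model is least confident in the gold
# label, so mixing them into the debugging batch guards original accuracy.
import torch
import torch.nn.functional as F

def sample_in_danger(model, orig_x, orig_y, k):
    """Pick the k original training examples the model is closest to flipping."""
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(orig_x), dim=-1)
        # Confidence assigned to the gold label; a low value means "in danger".
        gold_conf = probs.gather(1, orig_y.unsqueeze(1)).squeeze(1)
    idx = torch.argsort(gold_conf)[:k]  # lowest gold-label confidence first
    return orig_x[idx], orig_y[idx]

def few_shot_debug(model, debug_x, debug_y, orig_x, orig_y,
                   k=8, steps=20, lr=1e-5):
    """Briefly fine-tune on the debugging examples plus k sampled guard
    examples, instead of retraining for a full epoch on the original data."""
    guard_x, guard_y = sample_in_danger(model, orig_x, orig_y, k)
    x = torch.cat([debug_x, guard_x])
    y = torch.cat([debug_y, guard_y])
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        opt.step()
    return model

# Toy usage: a linear classifier stands in for the transformer.
model = torch.nn.Linear(16, 2)
orig_x, orig_y = torch.randn(100, 16), torch.randint(0, 2, (100,))
debug_x, debug_y = torch.randn(4, 16), torch.randint(0, 2, (4,))
few_shot_debug(model, debug_x, debug_y, orig_x, orig_y)
```

Note the design contrast with the baselines the abstract mentions: parameter-distance or KL-divergence methods constrain how far the update may move the model, whereas this data-side approach mixes a few original examples into the debugging update itself.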