Simple and Fast Group Robustness by Automatic Feature Reweighting
Published at the 40th International Conference on Machine Learning (ICML), 2023.
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: A major challenge to out-of-distribution generalization is reliance on spurious features: patterns that are predictive of the class label in the training data distribution but not causally related to the target. Standard methods for reducing reliance on spurious features typically assume that we know what the spurious feature is, which is rarely true in the real world. Methods that attempt to alleviate this limitation are complex, hard to tune, and incur a significant computational overhead compared to standard training. In this paper, we propose Automatic Feature Reweighting (AFR), an extremely simple and fast method for updating the model to reduce its reliance on spurious features. AFR retrains the last layer of a standard ERM-trained base model with a weighted loss that emphasizes the examples the ERM model predicts poorly, automatically upweighting the minority group without group labels. With this simple procedure, we improve upon the best reported results among competing methods trained without spurious attributes on several vision and natural language classification benchmarks, using only a fraction of their compute.
DOI: 10.48550/arxiv.2306.11074
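The abstract describes AFR's core mechanism: retrain only the last layer of an ERM-trained model with a weighted loss that upweights examples the base model predicts poorly. The sketch below illustrates that idea under stated assumptions; the specific weighting w_i proportional to exp(-gamma * p_i), where p_i is the base model's probability of the true class, the hyperparameter gamma, and all function names are illustrative choices here, not the paper's exact recipe.

```python
# Illustrative sketch of weighted last-layer retraining in the spirit of AFR.
# The exponential weighting and the hyperparameters are assumptions for
# illustration, not necessarily the paper's exact procedure.
import torch
import torch.nn as nn
import torch.nn.functional as F


def compute_reweighting(base_head, features, labels, gamma=4.0):
    """Upweight examples the ERM base model predicts poorly.

    Assumed weighting: w_i proportional to exp(-gamma * p_i), where p_i is
    the base model's softmax probability of the true class for example i.
    """
    with torch.no_grad():
        probs = F.softmax(base_head(features), dim=-1)
        p_true = probs[torch.arange(len(labels)), labels]
        weights = torch.exp(-gamma * p_true)
        return weights / weights.sum()


def retrain_last_layer(features, labels, weights, num_classes, epochs=500, lr=1e-2):
    """Fit a fresh linear head on frozen features with a weighted cross-entropy."""
    head = nn.Linear(features.shape[1], num_classes)
    opt = torch.optim.Adam(head.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        losses = F.cross_entropy(head(features), labels, reduction="none")
        (weights * losses).sum().backward()
        opt.step()
    return head


if __name__ == "__main__":
    # Tiny synthetic demo: stand-ins for penultimate-layer features of a
    # frozen ERM model and its trained classification head.
    torch.manual_seed(0)
    feats = torch.randn(256, 16)          # frozen feature-extractor outputs
    labels = torch.randint(0, 2, (256,))  # binary class labels
    erm_head = nn.Linear(16, 2)           # stand-in for the ERM-trained last layer
    w = compute_reweighting(erm_head, feats, labels)
    new_head = retrain_last_layer(feats, labels, w, num_classes=2)
```

In this reading, only the linear head is updated while the feature extractor stays frozen, which is why the procedure costs a small fraction of full retraining; the weights are computed once from the base model's held-out predictions rather than requiring group labels.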