Bagging Provides Assumption-free Stability
Format: Article
Language: English
Abstract: Bagging is an important technique for stabilizing machine learning models. In this paper, we derive a finite-sample guarantee on the stability of bagging for any model. Our result places no assumptions on the distribution of the data, on the properties of the base algorithm, or on the dimensionality of the covariates. Our guarantee applies to many variants of bagging and is optimal up to a constant. Empirical results validate our findings, showing that bagging successfully stabilizes even highly unstable base algorithms.
DOI: 10.48550/arxiv.2301.12600
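The abstract's claim that bagging stabilizes even highly unstable base algorithms can be illustrated with a minimal sketch. This is not the paper's construction: the 1-nearest-neighbor base learner, the toy data with a planted outlier, and the leave-one-out stability probe below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_nn_predict(X, y, x0):
    """Deliberately unstable base algorithm: 1-nearest-neighbor regression."""
    return y[np.argmin(np.abs(X - x0))]

def bagged_predict(X, y, x0, n_bags=200):
    """Bagging: average the base prediction over bootstrap resamples."""
    n = len(X)
    preds = [one_nn_predict(X[idx], y[idx], x0)
             for idx in (rng.integers(0, n, size=n) for _ in range(n_bags))]
    return float(np.mean(preds))

# Toy data with one gross outlier at the query point's nearest neighbor,
# so the 1-NN prediction at x0 hinges on a single training example.
n = 50
X = rng.uniform(0.0, 1.0, size=n)
y = 2.0 * X + rng.normal(0.0, 0.1, size=n)
x0 = 0.5
y[np.argmin(np.abs(X - x0))] += 10.0  # corrupt the nearest neighbor

# Stability probe: how far does the prediction at x0 move when a single
# training point is deleted (leave-one-out perturbation)?
full_base = one_nn_predict(X, y, x0)
full_bag = bagged_predict(X, y, x0)
base_shifts = [abs(one_nn_predict(np.delete(X, i), np.delete(y, i), x0) - full_base)
               for i in range(n)]
bag_shifts = [abs(bagged_predict(np.delete(X, i), np.delete(y, i), x0) - full_bag)
              for i in range(n)]
print(f"worst-case LOO shift, base 1-NN: {max(base_shifts):.2f}")
print(f"worst-case LOO shift, bagged   : {max(bag_shifts):.2f}")
```

Deleting the outlier swings the raw 1-NN prediction by roughly the size of the corruption, while the bagged prediction, being an average over bootstrap resamples in which that point often does not appear, moves markedly less under the same perturbation.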