Polynomial regression under arbitrary product distributions
Published in: | Machine Learning, 2010-09, Vol. 80 (2-3), p. 273-294 |
Format: | Article |
Language: | English |
Online access: | Full text |
Abstract: | In recent work, Kalai, Klivans, Mansour, and Servedio (2005) studied a variant of the "Low-Degree (Fourier) Algorithm" for learning under the uniform probability distribution on {0,1}^n. They showed that the L_1 polynomial regression algorithm yields agnostic (tolerant to arbitrary noise) learning algorithms with respect to the class of threshold functions, under certain restricted instance distributions, including uniform on {0,1}^n and Gaussian on ℝ^n. In this work we show how all learning results based on the Low-Degree Algorithm can be generalized to give almost identical agnostic guarantees under arbitrary product distributions on instance spaces X_1 × ⋯ × X_n. We also extend these results to learning under mixtures of product distributions. The main technical innovation is the use of (Hoeffding) orthogonal decomposition and the extension of the "noise sensitivity method" to arbitrary product spaces. In particular, we give a very simple proof that threshold functions over arbitrary product spaces have δ-noise sensitivity, resolving an open problem suggested by Peres (2004). |
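To make the central algorithmic idea concrete, here is a minimal sketch (not the authors' code) of the L_1 polynomial regression step the abstract refers to: expand each example into all monomials of degree at most d, fit coefficients minimizing the total absolute error by casting the problem as a linear program, and output the sign of the fitted polynomial as the hypothesis. All function names are illustrative, and the final thresholding is simplified (the agnostic analysis thresholds at a suitably chosen value rather than at 0).

```python
# Hedged sketch of degree-d L_1 polynomial regression for agnostic learning.
import itertools

import numpy as np
from scipy.optimize import linprog


def degree_d_features(X, d):
    """All monomials of degree <= d over the coordinates of X."""
    m, n = X.shape
    cols = [np.ones(m)]  # the constant monomial
    for k in range(1, d + 1):
        for S in itertools.combinations(range(n), k):
            cols.append(np.prod(X[:, list(S)], axis=1))
    return np.column_stack(cols)


def l1_poly_regress(X, y, d):
    """Coefficients c minimizing sum_i |p_c(x_i) - y_i|, as a linear program:
    minimize sum(t) subject to -t <= Phi c - y <= t, t >= 0."""
    Phi = degree_d_features(X, d)
    m, k = Phi.shape
    obj = np.concatenate([np.zeros(k), np.ones(m)])  # variables: (c, t)
    A_ub = np.block([[Phi, -np.eye(m)], [-Phi, -np.eye(m)]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * k + [(0, None)] * m
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[:k]


def predict(X, coeffs, d):
    """Hypothesis: the sign of the fitted polynomial (thresholded at 0
    here for simplicity)."""
    return np.sign(degree_d_features(X, d) @ coeffs)
```

For instance, fitting a degree-1 polynomial to the 3-bit majority function over {-1,1}^3 yields a linear form whose total absolute training error is small, even though majority is not exactly linear.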
ISSN: | 0885-6125, 1573-0565 |
DOI: | 10.1007/s10994-010-5179-6 |