DebiNet: Debiasing Linear Models with Nonlinear Overparameterized Neural Networks
Format: Article
Language: English
Abstract: Recent years have witnessed strong empirical performance of over-parameterized neural networks on various tasks, as well as many advances in the theory, e.g., universal approximation and provable convergence to a global minimum. In this paper, we incorporate over-parameterized neural networks into semi-parametric models to bridge the gap between inference and prediction, especially in high-dimensional linear problems. By doing so, we can exploit a wide class of networks to approximate the nuisance functions and to estimate the parameters of interest consistently. We thus offer the best of both worlds: the universal approximation ability of neural networks and the interpretability of the classic ordinary linear model, leading to both valid inference and accurate prediction. We present the theoretical foundations that make this possible and demonstrate them with numerical experiments. Furthermore, we propose a framework, DebiNet, in which arbitrary feature selection methods can be plugged into our semi-parametric neural network. DebiNet can debias regularized estimators (e.g., the Lasso) and performs well in terms of both post-selection inference and generalization error.
DOI: 10.48550/arxiv.2011.00417
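
To make the semi-parametric recipe in the abstract concrete, below is a minimal two-stage sketch in Python. It is not the authors' implementation: the simulated data, the use of scikit-learn's LassoCV and MLPRegressor, and the residual-based refit are illustrative assumptions standing in for the selection, nuisance-approximation, and debiasing steps the abstract describes.

```python
# Illustrative sketch only (assumed, not the DebiNet code): a Lasso selects
# the linear features, an over-parameterized network absorbs the nonlinear
# nuisance, and the selected coefficients are re-fit on the residuals.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n, d = 500, 50
X = rng.normal(size=(n, d))
beta = np.zeros(d)
beta[:3] = [2.0, -1.5, 1.0]          # sparse linear signal of interest
nuisance_true = np.cos(X[:, 3])      # nonlinear nuisance component
y = X @ beta + nuisance_true + 0.1 * rng.normal(size=n)

# Step 1: feature selection (here the Lasso; any selector could be plugged in).
support = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)
rest = np.setdiff1d(np.arange(d), support)

# Step 2: a wide network approximates the nuisance signal carried by the
# non-selected features.
net = MLPRegressor(hidden_layer_sizes=(256,), max_iter=2000,
                   random_state=0).fit(X[:, rest], y)

# Step 3: refit the selected features on the network's residuals to obtain
# debiased, interpretable linear coefficients for post-selection inference.
residuals = y - net.predict(X[:, rest])
ols = LinearRegression().fit(X[:, support], residuals)
print("selected features:", support)
print("debiased coefficients:", ols.coef_)
```

Because the simulated features are independent, the network fitted on the non-selected columns recovers only the nonlinear nuisance, so the residual refit isolates the sparse linear coefficients; the paper's actual estimator and its inferential guarantees are developed in the full text.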