Adaptive Function-on-Scalar Regression with a Smoothing Elastic Net
Format: Article
Language: English
Abstract: This paper presents a new methodology, called AFSSEN, to simultaneously
select significant predictors and produce smooth estimates in a
high-dimensional function-on-scalar linear model with sub-Gaussian errors.
Outcomes are assumed to lie in a general real separable Hilbert space, H, while
parameters lie in a subspace known as a Cameron-Martin space, K, which is
closely related to Reproducing Kernel Hilbert Spaces, so that parameter
estimates inherit particular properties, such as smoothness or periodicity,
without enforcing such properties on the data. We propose a regularization
method in the style of an adaptive Elastic Net penalty that mixes two
types of functional norms, providing fine-tuned control of both the smoothing
and the variable selection in the estimated model. Asymptotic theory is provided in
the form of a functional oracle property, and the paper concludes with a
simulation study demonstrating the advantage of using AFSSEN over existing
methods in terms of prediction error and variable selection.
DOI: 10.48550/arxiv.1905.09881
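As a rough illustration only (the paper's exact objective, weights, and notation are not quoted in this record), a penalized criterion that mixes two functional norms in the adaptive elastic-net style described by the abstract might take the form

\[
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta_1,\dots,\beta_I \in K}\;
\frac{1}{2N}\sum_{n=1}^{N}\Big\| Y_n - \sum_{i=1}^{I} X_{ni}\,\beta_i \Big\|_H^2
\;+\; \lambda_H \sum_{i=1}^{I} \tilde{w}_i\,\|\beta_i\|_H
\;+\; \frac{\lambda_K}{2} \sum_{i=1}^{I} \|\beta_i\|_K^2 ,
\]

where the adaptive weights \(\tilde{w}_i\) together with the lasso-type penalty on the outcome-space norm \(\|\cdot\|_H\) drive variable selection, and the ridge-type penalty on the Cameron-Martin norm \(\|\cdot\|_K\) controls the smoothness of the selected coefficient functions. The symbols \(\lambda_H\), \(\lambda_K\), and \(\tilde{w}_i\) are placeholders introduced here for illustration, not the authors' notation.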