Structure learning via unstructured kernel-based M-regression
Format: Article
Language: English
Abstract: In statistical learning, identifying the underlying structure of the true target function from observed data plays a crucial role in facilitating subsequent modeling and analysis. Unlike most existing methods, which focus on specific settings under certain model assumptions, this paper proposes a general and novel framework for recovering the true structure of a target function by using unstructured M-regression in a reproducing kernel Hilbert space (RKHS). The proposed framework is inspired by the fact that gradient functions can serve as a valid tool for learning underlying structures, including sparse learning, interaction selection, and model identification, and it is easy to implement thanks to the nice properties of the RKHS. More importantly, it admits a wide range of loss functions, and thus encompasses many commonly used methods, such as mean regression, quantile regression, likelihood-based classification, and margin-based classification; it is also computationally efficient, as it requires only solving convex optimization tasks. Asymptotic results for the proposed framework are established within a rich family of loss functions without any explicit model specification. The superior performance of the proposed framework is also demonstrated on a variety of simulated examples and a real case study.
DOI: 10.48550/arxiv.1901.00615
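To make the gradient-based idea in the abstract concrete, here is a minimal sketch, not the paper's actual estimator: fit an unstructured kernel ridge regression (squared loss, a Gaussian kernel) in an RKHS, then use the empirical norms of the fitted function's partial derivatives to flag which input variables are active. The bandwidth `sigma` and penalty `lam` are assumed illustrative values, and the simulated data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the response depends only on x1 and x2; x3 and x4 are irrelevant.
n, p = 200, 4
X = rng.uniform(-1.0, 1.0, (n, p))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)

sigma, lam = 0.5, 1e-2  # bandwidth and ridge penalty (assumed values)

# Gaussian kernel matrix K[i, m] = exp(-||x_i - x_m||^2 / (2 sigma^2))
sqd = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sqd / (2.0 * sigma ** 2))

# Kernel ridge regression with squared loss: f(x) = sum_i alpha_i k(x_i, x)
alpha = np.linalg.solve(K + n * lam * np.eye(n), y)

# Gradient of f evaluated at each sample point:
#   df/dx_j (x_m) = sum_i alpha_i * K[i, m] * (X[i, j] - X[m, j]) / sigma^2
diff = (X[:, None, :] - X[None, :, :]) / sigma ** 2
grads = np.einsum('i,im,imj->mj', alpha, K, diff)

# Empirical L2 norm of each partial derivative: large norms flag active variables.
norms = np.sqrt((grads ** 2).mean(axis=0))
print(norms)  # the first two entries should dominate the last two
```

Note that the sketch needs no model specification: sparsity is read off the fitted gradient, and swapping the squared loss for a quantile or margin-based loss would follow the same recipe.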