Block-wise Primal-dual Algorithms for Large-scale Doubly Penalized ANOVA Modeling
Saved in:
Main authors:
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: For multivariate nonparametric regression, doubly penalized ANOVA modeling (DPAM) has recently been proposed, using hierarchical total variations (HTVs) and empirical norms as penalties on the component functions such as main effects and multi-way interactions in a functional ANOVA decomposition of the underlying regression function. The two penalties play complementary roles: the HTV penalty promotes sparsity in the selection of basis functions within each component function, whereas the empirical-norm penalty promotes sparsity in the selection of component functions. We adopt backfitting or block minimization for training DPAM, and develop two suitable primal-dual algorithms, including both batch and stochastic versions, for updating each component function in single-block optimization. Existing applications of primal-dual algorithms are intractable in our setting with both HTV and empirical-norm penalties. Through extensive numerical experiments, we demonstrate the validity and advantage of our stochastic primal-dual algorithms, compared with their batch versions and a previous active-set algorithm, in large-scale scenarios.
DOI: 10.48550/arxiv.2210.10991
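
To make the single-block update concrete, below is a minimal sketch of a generic Chambolle-Pock primal-dual iteration applied to one plausible form of the per-component objective: a squared-error loss on a partial residual, an l1 (HTV-style) penalty on differences of basis coefficients, and an empirical-norm penalty on the fitted component. The names Phi (basis matrix), D (difference operator), r (partial residual), and the tuning parameters lam and rho are illustrative assumptions, not the paper's notation, and the paper's actual batch and stochastic algorithms may differ.

```python
# Sketch (not the paper's algorithm): Chambolle-Pock for one block of
#   min_beta  (1/(2n)) ||r - Phi beta||^2  +  lam * ||D beta||_1
#             + (rho / sqrt(n)) * ||Phi beta||_2,
# where (rho/sqrt(n))*||Phi beta||_2 equals rho times the empirical norm
# of the fitted component. All names here are illustrative assumptions.
import numpy as np

def project_l2_ball(z, radius):
    """Euclidean projection onto the l2 ball of the given radius."""
    nrm = np.linalg.norm(z)
    return z if nrm <= radius else z * (radius / nrm)

def single_block_primal_dual(Phi, D, r, lam, rho, n_iter=500):
    """Generic primal-dual (Chambolle-Pock) update for one component block."""
    n, p = Phi.shape
    K = np.vstack([Phi, Phi, D])        # stacked linear operator for all terms
    L = np.linalg.norm(K, 2)            # spectral norm of K sets the step sizes
    tau = sigma = 0.99 / L              # tau * sigma * L^2 < 1 for convergence
    beta = np.zeros(p)
    beta_bar = beta.copy()
    y1 = np.zeros(n)                    # dual variable: squared-error loss
    y2 = np.zeros(n)                    # dual variable: empirical-norm penalty
    y3 = np.zeros(D.shape[0])           # dual variable: HTV-style l1 penalty
    for _ in range(n_iter):
        Pb = Phi @ beta_bar
        # Dual steps: prox of each conjugate term (via the Moreau identity).
        y1 = (y1 + sigma * Pb - sigma * r) / (1.0 + sigma * n)
        y2 = project_l2_ball(y2 + sigma * Pb, rho / np.sqrt(n))
        y3 = np.clip(y3 + sigma * (D @ beta_bar), -lam, lam)
        # Primal step (no separate smooth term here), then extrapolation.
        beta_new = beta - tau * (Phi.T @ (y1 + y2) + D.T @ y3)
        beta_bar = 2.0 * beta_new - beta
        beta = beta_new
    return beta

# Toy usage with a first-difference operator standing in for the HTV structure.
rng = np.random.default_rng(0)
n, p = 200, 50
Phi = rng.standard_normal((n, p))
D = np.diff(np.eye(p), axis=0)
r = rng.standard_normal(n)
beta_hat = single_block_primal_dual(Phi, D, r, lam=0.1, rho=0.5)
```

In this sketch each nonsmooth term is dualized separately, so every dual update is a closed-form prox (a shrinkage toward r, a ball projection, and a clip); a stochastic version along the lines the abstract describes would replace the full products Phi @ beta_bar and Phi.T @ y with minibatch estimates over the n observations.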