Conditional Matrix Flows for Gaussian Graphical Models
Format: Article
Language: English
Abstract: Studying conditional independence among many variables with few observations is a challenging task. Gaussian Graphical Models (GGMs) tackle this problem by encouraging sparsity in the precision matrix through $l_q$ regularization with $q\leq1$. However, most GGMs rely on the $l_1$ norm because the objective is highly non-convex for sub-$l_1$ pseudo-norms. In the frequentist formulation, the $l_1$ norm relaxation provides the solution path as a function of the shrinkage parameter $\lambda$. In the Bayesian formulation, sparsity is instead encouraged through a Laplace prior, but posterior inference for different $\lambda$ requires repeated runs of expensive Gibbs samplers. Here we propose a general framework for variational inference with matrix-variate Normalizing Flow in GGMs, which unifies the benefits of frequentist and Bayesian frameworks. As a key improvement on previous work, we train with one flow a continuum of sparse regression models jointly for all regularization parameters $\lambda$ and all $l_q$ norms, including non-convex sub-$l_1$ pseudo-norms. Within one model we thus have access to (i) the evolution of the posterior for any $\lambda$ and any $l_q$ (pseudo-)norm, (ii) the marginal log-likelihood for model selection, and (iii) the frequentist solution paths through simulated annealing in the MAP limit.
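
To make the regularized objective concrete, here is a minimal NumPy sketch (not the authors' code) of the $l_q$-penalized negative log-likelihood for a GGM precision matrix that the abstract describes; the function name `ggm_objective` and the toy data are illustrative assumptions.

```python
# Minimal sketch, assuming the standard penalized GGM objective:
# -log det(Theta) + tr(S Theta) + lambda * sum_{i != j} |Theta_ij|^q,
# where q <= 1 encourages sparsity and q < 1 gives a non-convex pseudo-norm.
import numpy as np

def ggm_objective(theta: np.ndarray, S: np.ndarray, lam: float, q: float) -> float:
    """Penalized negative Gaussian log-likelihood (up to constants).

    theta : candidate precision matrix (symmetric positive definite)
    S     : empirical covariance of the data
    lam   : shrinkage parameter lambda
    q     : exponent of the l_q (pseudo-)norm
    """
    sign, logdet = np.linalg.slogdet(theta)
    if sign <= 0:
        return np.inf  # outside the positive-definite cone
    off_diag = theta - np.diag(np.diag(theta))  # penalize off-diagonal entries only
    penalty = lam * np.sum(np.abs(off_diag) ** q)
    return -logdet + np.trace(S @ theta) + penalty

# Toy usage: evaluate the objective on simulated data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))             # n = 50 observations, p = 5 variables
S = np.cov(X, rowvar=False)
theta0 = np.eye(5)                           # trivial diagonal initialization
print(ggm_objective(theta0, S, lam=0.1, q=1.0))   # convex l_1 case
print(ggm_objective(theta0, S, lam=0.1, q=0.5))   # non-convex sub-l_1 case
```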
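The "one flow for all $\lambda$ and all $l_q$ norms" idea amounts to conditioning a single normalizing flow on the pair $(\lambda, q)$. The following PyTorch sketch illustrates that conditioning mechanism with one affine layer; it is a hypothetical simplification, not the paper's matrix-variate architecture, and the class and variable names are assumptions.

```python
# Minimal sketch, assuming a conditional affine flow: a small network reads
# the context c = (lambda, q) and emits the scale and shift of the transform,
# so one set of weights covers a continuum of regularization settings.
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    def __init__(self, dim: int, ctx_dim: int = 2, hidden: int = 64):
        super().__init__()
        # Hypothetical conditioner: maps (lambda, q) to per-dimension scale/shift.
        self.conditioner = nn.Sequential(
            nn.Linear(ctx_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * dim),
        )

    def forward(self, z: torch.Tensor, ctx: torch.Tensor):
        """Transform base samples z given context ctx = (lambda, q).

        Returns the transformed samples and log|det Jacobian|, the term
        that enters a variational (ELBO) objective for flow-based inference.
        """
        log_scale, shift = self.conditioner(ctx).chunk(2, dim=-1)
        x = z * torch.exp(log_scale) + shift
        log_det = log_scale.sum(dim=-1)
        return x, log_det

# Toy usage: the same model evaluated at two different (lambda, q) settings.
flow = ConditionalAffineFlow(dim=10)
z = torch.randn(4, 10)
x1, ld1 = flow(z, torch.tensor([[0.1, 1.0]]).expand(4, -1))  # l_1 norm, lambda = 0.1
x2, ld2 = flow(z, torch.tensor([[1.0, 0.5]]).expand(4, -1))  # l_0.5 pseudo-norm, lambda = 1.0
```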
DOI: 10.48550/arxiv.2306.07255