Bridging multiple worlds: multi-marginal optimal transport for causal partial-identification problem
Format: Article
Language: English
Online Access: Order full text
Summary: Under the prevalent potential outcome model in causal inference, each unit is associated with multiple potential outcomes, of which at most one is observed, leaving many causal quantities only partially identified. This inherent missing-data issue echoes the multi-marginal optimal transport (MOT) problem, where the marginal distributions are known but the coupling that forms their joint distribution is not. In this paper, we cast the causal partial-identification problem in the framework of MOT with $K$ margins and $d$-dimensional outcomes and obtain the exact partially identified set. To estimate the partially identified set via MOT, we establish a statistical convergence rate for the plug-in MOT estimator under the $\ell_2$ cost function stemming from the variance-minimization problem and prove that it is minimax optimal for arbitrary $K$ and $d \le 4$. We also extend the convergence result to general quadratic objective functions. Numerically, we demonstrate the efficacy of our method on synthetic datasets and several real-world datasets, where our proposal consistently outperforms the baseline by a significant margin (over 70%). In addition, we provide efficient off-the-shelf implementations of MOT with general objective functions.
DOI: 10.48550/arxiv.2406.07868
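
To make the plug-in MOT idea in the abstract concrete, the sketch below is a minimal, illustrative implementation (not the authors' released code): it couples $K$ empirical marginals so as to minimize the average within-tuple variance, an $\ell_2$-type cost of the kind the abstract describes, and solves the resulting small discrete problem as a linear program with SciPy. The function name `plugin_mot_variance` and the choice of solver are assumptions for illustration only.

```python
# Hypothetical sketch of a plug-in multi-marginal OT estimator for a
# variance-minimization cost, assuming small K and 1-D outcomes.
import itertools
import numpy as np
from scipy.optimize import linprog

def plugin_mot_variance(samples):
    """samples: list of K 1-D arrays, each holding the observed outcomes of one margin."""
    K = len(samples)
    sizes = [len(s) for s in samples]

    # Cost tensor: within-tuple variance for every combination of support points.
    cost = np.empty(sizes)
    for idx in itertools.product(*[range(m) for m in sizes]):
        vals = np.array([samples[k][idx[k]] for k in range(K)])
        cost[idx] = vals.var()

    # Marginal constraints: each margin of the coupling equals its empirical (uniform) law.
    A_rows, b_vals = [], []
    for k in range(K):
        for j in range(sizes[k]):
            row = np.zeros(sizes)
            sl = [slice(None)] * K
            sl[k] = j
            row[tuple(sl)] = 1.0
            A_rows.append(row.ravel())
            b_vals.append(1.0 / sizes[k])

    res = linprog(cost.ravel(), A_eq=np.vstack(A_rows), b_eq=np.array(b_vals),
                  bounds=(0, None), method="highs")
    return res.fun  # plug-in estimate of the minimal expected within-tuple variance

# Example: K = 3 synthetic margins with 15 samples each.
rng = np.random.default_rng(0)
marginals = [rng.normal(loc=k, scale=1.0, size=15) for k in range(3)]
print(plugin_mot_variance(marginals))
```

The linear-programming formulation scales exponentially in $K$ (the coupling has $n^K$ entries), so it is only meant to show the structure of the plug-in estimator on toy data, not to reproduce the paper's efficient implementations.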