Identification and Auto-debiased Machine Learning for Outcome Conditioned Average Structural Derivatives

Bibliographic Details
Main Authors: Jin, Zequn; Lin, Lihua; Zhang, Zhengyu
Format: Article
Language: English
Description
Abstract: This paper proposes a new class of heterogeneous causal quantities, named \textit{outcome conditioned} average structural derivatives (OASD), in a general nonseparable model. OASD is the average partial effect of a marginal change in a continuous treatment on individuals located at different parts of the outcome distribution, irrespective of individuals' characteristics. OASD combines features of both the ATE and the QTE: it is interpreted as straightforwardly as the ATE while at the same time being more granular than the ATE, since it breaks the entire population up according to the rank of the outcome distribution. One contribution of this paper is to establish close relationships between the \textit{outcome conditioned average partial effects} and a class of parameters measuring the effect of counterfactually changing the distribution of a single covariate on the unconditional outcome quantiles. By exploiting this relationship, we obtain a root-$n$ consistent estimator and calculate the semiparametric efficiency bound for these counterfactual effect parameters. We illustrate this point with two examples: the equivalence between OASD and the unconditional partial quantile effect (Firpo et al. (2009)), and the equivalence between the marginal partial distribution policy effect (Rothe (2012)) and a corresponding outcome conditioned parameter. Because identification of OASD is attained under a conditional exogeneity assumption, by controlling for rich information about covariates, a researcher may ideally use high-dimensional controls in the data. We propose a novel automatic debiased machine learning estimator for OASD and present asymptotic statistical guarantees for it. We prove that our estimator is root-$n$ consistent, asymptotically normal, and semiparametrically efficient. We also prove the validity of a bootstrap procedure for uniform inference on the OASD process.
DOI: 10.48550/arxiv.2211.07903
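
For readers unfamiliar with the estimation strategy named in the abstract, the following is a minimal, illustrative sketch of automatic debiased machine learning with cross-fitting, applied for concreteness to the simpler average-derivative parameter $\theta = E[\partial_t E[Y \mid T, X]]$ rather than to OASD itself. It is not the paper's estimator: the function name, the lasso-plus-polynomial nuisance learners, and the finite-difference derivatives are illustrative assumptions; only the generic recipe (fit the outcome regression, estimate the Riesz representer automatically from a moment restriction, and average a Neyman-orthogonal score over held-out folds) reflects the kind of procedure the abstract describes.

    # Illustrative sketch only -- NOT the OASD estimator of Jin, Lin, and Zhang.
    # It shows the generic auto-DML recipe for an average derivative
    # theta = E[ d/dt E[Y | T, X] ] with cross-fitting and an automatically
    # estimated Riesz representer. All names here are hypothetical.
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.model_selection import KFold
    from sklearn.preprocessing import PolynomialFeatures

    def auto_dml_average_derivative(y, t, x, n_folds=5, eps=1e-3, ridge=1e-6):
        """Cross-fitted auto-DML estimate of E[d/dt E[Y|T,X]] and a standard error."""
        n = len(y)
        w = np.column_stack([t, x])          # W = (T, X); treatment in column 0
        psi = np.zeros(n)                    # orthogonal score for each observation

        for train, test in KFold(n_folds, shuffle=True, random_state=0).split(w):
            poly = PolynomialFeatures(degree=2, include_bias=True)
            b_tr = poly.fit_transform(w[train])      # dictionary b(W) on the training fold

            # (1) Regression nuisance m(t, x) = E[Y | T = t, X = x], fit by lasso on b(W).
            m_hat = LassoCV(cv=3).fit(b_tr, y[train])

            # (2) Riesz representer alpha(W) = b(W)'rho, obtained "automatically" from
            #     the restriction E[alpha(W) b(W)] = E[d/dt b(W)] (no explicit density).
            w_plus, w_minus = w[train].copy(), w[train].copy()
            w_plus[:, 0] += eps
            w_minus[:, 0] -= eps
            db_dt = (poly.transform(w_plus) - poly.transform(w_minus)) / (2 * eps)
            G = b_tr.T @ b_tr / len(train)
            rho = np.linalg.solve(G + ridge * np.eye(G.shape[0]), db_dt.mean(axis=0))

            # (3) Neyman-orthogonal score evaluated on the held-out fold.
            w_te = w[test]
            w_te_plus, w_te_minus = w_te.copy(), w_te.copy()
            w_te_plus[:, 0] += eps
            w_te_minus[:, 0] -= eps
            dm_dt = (m_hat.predict(poly.transform(w_te_plus))
                     - m_hat.predict(poly.transform(w_te_minus))) / (2 * eps)
            alpha = poly.transform(w_te) @ rho
            psi[test] = dm_dt + alpha * (y[test] - m_hat.predict(poly.transform(w_te)))

        theta_hat = psi.mean()
        se = psi.std(ddof=1) / np.sqrt(n)    # root-n, influence-function-based standard error
        return theta_hat, se

Cross-fitting is what permits flexible, possibly high-dimensional nuisance learners while retaining root-$n$ asymptotics, and estimating the Riesz representer directly from a moment restriction (rather than from an explicit density or propensity model) is the sense in which such estimators are called "automatic" and "debiased"; the paper develops the corresponding construction and theory for the OASD process itself.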