no code implementations • 24 Jan 2024 • Thomas S. Richardson, James M. Robins
In this note we give proofs for results relating to the Instrumental Variable (IV) model with binary response $Y$ and binary treatment $X$, but with an instrument $Z$ with $K$ states.
no code implementations • 18 Jun 2023 • Lin Liu, Rajarshi Mukherjee, James M. Robins
In many instances, an analyst justifies her claim by imposing complexity-reducing assumptions on $b$ and $p$ to ensure "rate double-robustness".
1 code implementation • 19 May 2022 • Benjamin Kompa, David R. Bellamy, Thomas Kolokotrones, James M. Robins, Andrew L. Beam
In this work, we introduce a flexible and scalable method based on a deep neural network to estimate causal effects in the presence of unmeasured confounding using proximal inference.
no code implementations • 13 Aug 2020 • Ilya Shpitser, Thomas S. Richardson, James M. Robins
Among Judea Pearl's many contributions to Causality and Statistics, the graphical d-separation criterion, the do-calculus and the mediation formula stand out.
no code implementations • 7 Aug 2020 • Lin Liu, Rajarshi Mukherjee, James M. Robins
This is the rejoinder to the discussion by Kennedy, Balakrishnan and Wasserman on the paper "On nearly assumption-free tests of nominal confidence interval coverage for causal parameters estimated by machine learning" published in Statistical Science.
no code implementations • 29 Jun 2019 • Rohit Bhattacharya, Razieh Nabi, Ilya Shpitser, James M. Robins
Missing data is a pervasive problem in data analyses, resulting in datasets that contain censored realizations of a target distribution.
no code implementations • 8 Apr 2019 • Lin Liu, Rajarshi Mukherjee, James M. Robins
In this paper, we introduce essentially assumption-free tests that (i) can falsify the null hypothesis that the bias of $\hat{\psi}_{1}$ is of smaller order than its standard error, (ii) can provide an upper confidence bound on the true coverage of the Wald interval, and (iii) are valid under the null under no smoothness/sparsity assumptions on the nuisance parameters.
no code implementations • 7 Apr 2019 • Ezequiel Smucler, Andrea Rotnitzky, James M. Robins
We focus on a class of parameters whose influence function depends on two infinite-dimensional nuisance functions, and for which the bias of the one-step estimator of the parameter of interest is the expectation of the product of the estimation errors of the two nuisance functions.
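As a sketch of why this product structure matters (in generic notation $b$ and $p$ for the two nuisances, not necessarily the paper's), suppose the one-step estimator $\hat{\psi}_{1}$ satisfies

$$\mathrm{Bias}(\hat{\psi}_{1}) \;=\; E\!\left[\bigl(\hat{b}(X) - b(X)\bigr)\bigl(\hat{p}(X) - p(X)\bigr)\right].$$

If $\hat{b}$ and $\hat{p}$ converge in $L_{2}$ at rates $n^{-\alpha}$ and $n^{-\beta}$, then by Cauchy–Schwarz the bias is $O(n^{-(\alpha+\beta)})$, which is $o(n^{-1/2})$ whenever $\alpha + \beta > 1/2$: either nuisance may be estimated slowly provided the other compensates, the "rate double-robustness" referred to above.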
no code implementations • NeurIPS 2015 • Kirthevasan Kandasamy, Akshay Krishnamurthy, Barnabas Poczos, Larry Wasserman, James M. Robins
We propose and analyse estimators for statistical functionals of one or more distributions under nonparametric assumptions. Our estimators are derived from the von Mises expansion and are based on the theory of influence functions, which appear in the semiparametric statistics literature. We show that estimators based either on data-splitting or a leave-one-out technique enjoy fast rates of convergence and other favorable theoretical properties. We apply this framework to derive estimators for several popular information-theoretic quantities, and via empirical evaluation, show the advantage of this approach over existing estimators.
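To make the data-splitting idea concrete, here is a minimal toy instance of a one-step influence-function estimator, for the quadratic functional $\theta = \int p(x)^2\,dx$ of a one-dimensional density. This is an illustrative sketch only, not the paper's estimator: the kernel density estimator, bandwidth, and grid integration are all assumptions made for the example.

```python
import numpy as np

def quadratic_functional_estimate(x, bandwidth=0.3, seed=0):
    """Toy data-splitting, one-step estimator of theta = ∫ p(x)^2 dx.

    Illustrative only: fit a Gaussian KDE p_hat on one half of the data,
    then apply the von Mises / influence-function correction
        theta_hat = 2 * mean(p_hat(X_eval)) - ∫ p_hat(x)^2 dx
    using the held-out half, so the second-order error is the squared
    KDE error rather than the first-order plug-in bias.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    perm = rng.permutation(len(x))
    train, held_out = x[perm[: len(x) // 2]], x[perm[len(x) // 2:]]

    def p_hat(t):
        # Gaussian kernel density estimate fit on the training half.
        z = (t[:, None] - train[None, :]) / bandwidth
        return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

    # Plug-in term ∫ p_hat(x)^2 dx, approximated on a fine grid.
    grid = np.linspace(x.min() - 3 * bandwidth, x.max() + 3 * bandwidth, 2000)
    plug_in = np.sum(p_hat(grid) ** 2) * (grid[1] - grid[0])

    # One-step correction from the influence function 2 * (p(x) - theta),
    # evaluated on the held-out half (the data-splitting step).
    return 2.0 * p_hat(held_out).mean() - plug_in

# For N(0,1) data the target is ∫ φ(x)^2 dx = 1 / (2 * sqrt(pi)) ≈ 0.282.
sample = np.random.default_rng(1).normal(size=4000)
est = quadratic_functional_estimate(sample)
```

The split matters: fitting and evaluating `p_hat` on the same points would couple the two terms and reintroduce a first-order bias, which is exactly what the data-splitting and leave-one-out variants analysed above are designed to avoid.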
2 code implementations • 17 Nov 2014 • Kirthevasan Kandasamy, Akshay Krishnamurthy, Barnabas Poczos, Larry Wasserman, James M. Robins
We propose and analyze estimators for statistical functionals of one or more distributions under nonparametric assumptions.
no code implementations • 26 Sep 2013 • Ilya Shpitser, Robin J. Evans, Thomas S. Richardson, James M. Robins
To make modeling and inference with nested Markov models practical, it is necessary to limit the number of parameters in the model, while still correctly capturing the constraints in the marginal of a DAG model.