Fisher Information and Mutual Information Constraints

11 Feb 2021 · Leighton Pate Barnes, Ayfer Ozgur

We consider the processing of statistical samples $X\sim P_\theta$ by a channel $p(y|x)$, and characterize how the statistical information from the samples for estimating the parameter $\theta\in\mathbb{R}^d$ can scale with the mutual information or capacity of the channel. We show that if the statistical model has a sub-Gaussian score function, then the trace of the Fisher information matrix for estimating $\theta$ from $Y$ can scale at most linearly with the mutual information between $X$ and $Y$. We apply this result to obtain minimax lower bounds in distributed statistical estimation problems, and to obtain a tight preconstant for Gaussian mean estimation. We then show how our Fisher information bound can also imply distributed strong data processing inequalities based on mutual information or Jensen-Shannon divergence.
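
For context, the linear-scaling statement above can be written in the following form (we give only its shape here: $I_Y(\theta)$ denotes the Fisher information matrix of $Y$ about $\theta$, $\sigma^2$ the sub-Gaussian parameter of the score function, and $C$ a constant; see the paper for the exact preconstant):

$$\mathrm{tr}\left(I_Y(\theta)\right) \le C\,\sigma^2\, I(X;Y).$$

As a rough numerical illustration of this scaling, the following Python sketch computes both quantities for the scalar Gaussian location model $X\sim\mathcal{N}(\theta,1)$, with $Y$ a $b$-bit uniform quantization of $X$. The quantizer, its threshold range, and the helper name fisher_and_mi are illustrative assumptions of ours, not a construction from the paper.

    import numpy as np
    from scipy.stats import norm

    def fisher_and_mi(theta, bits, lo=-4.0, hi=4.0):
        # Uniform b-bit quantizer of X ~ N(theta, 1): K = 2^b bins, with
        # K - 1 evenly spaced thresholds on [lo, hi] and unbounded end bins.
        # (Illustrative choice; not the quantizer analyzed in the paper.)
        K = 2 ** bits
        edges = np.concatenate(([-np.inf],
                                np.linspace(lo, hi, K + 1)[1:-1],
                                [np.inf]))
        cdf = norm.cdf(edges - theta)    # Phi(e_k - theta)
        p = np.diff(cdf)                 # bin probabilities p_k(theta)
        pdf = norm.pdf(edges - theta)    # phi(e_k - theta); phi(+/-inf) = 0
        dp = pdf[:-1] - pdf[1:]          # d p_k / d theta
        keep = p > 1e-15                 # drop numerically empty bins
        fisher = np.sum(dp[keep] ** 2 / p[keep])  # Fisher info of Y about theta
        # Y is a deterministic function of X, so I(X;Y) = H(Y) (in nats).
        mi = -np.sum(p[keep] * np.log(p[keep]))
        return fisher, mi

    for b in range(1, 7):
        f, mi = fisher_and_mi(theta=0.0, bits=b)
        print(f"b={b}: I_F = {f:.4f}, I(X;Y) = {mi:.4f} nats, ratio = {f/mi:.3f}")

Since $Y$ is a deterministic function of $X$, the $b$-bit output caps $I(X;Y)$ at $b\ln 2$ nats, while the Fisher information saturates at $1$ (that of the raw sample); the printed ratio stays bounded as $b$ grows, consistent with the linear scaling stated in the abstract.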


Categories


Information Theory · Statistics Theory
