Estimation of KL Divergence: Optimal Minimax Rate

20 Feb 2018 · Yuheng Bu · Shaofeng Zou · Yingbin Liang · Venugopal V. Veeravalli

The problem of estimating the Kullback-Leibler divergence $D(P\|Q)$ between two unknown distributions $P$ and $Q$ is studied, under the assumption that the alphabet size $k$ of the distributions can scale to infinity. The estimation is based on $m$ independent samples drawn from $P$ and $n$ independent samples drawn from $Q$...
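For context, a minimal sketch of the naive plug-in baseline for this estimation setup (empirical frequencies substituted into the divergence formula) is given below. This is an illustrative assumption on my part, not the estimator proposed in the paper; the function name and parameters are hypothetical. The plug-in estimate is undefined whenever a symbol with positive empirical mass under $P$ is unobserved under $Q$, which is one reason more refined estimators are studied.

```python
import numpy as np

def plugin_kl_estimate(samples_p, samples_q, k):
    """Naive plug-in estimate of D(P||Q) from i.i.d. samples over {0, ..., k-1}.

    NOTE: illustrative baseline only, not the paper's estimator.
    """
    m, n = len(samples_p), len(samples_q)
    p_hat = np.bincount(samples_p, minlength=k) / m  # empirical P
    q_hat = np.bincount(samples_q, minlength=k) / n  # empirical Q
    mask = p_hat > 0
    if np.any(q_hat[mask] == 0):
        return np.inf  # plug-in estimate blows up on unobserved Q symbols
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask])))

# Example usage with hypothetical distributions on k = 5 symbols
rng = np.random.default_rng(0)
P = np.array([0.4, 0.3, 0.2, 0.05, 0.05])
Q = np.array([0.2, 0.2, 0.2, 0.2, 0.2])
samples_p = rng.choice(5, size=2000, p=P)
samples_q = rng.choice(5, size=2000, p=Q)
print(plugin_kl_estimate(samples_p, samples_q, k=5))
```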


Categories


  • INFORMATION THEORY