Normal Approximation and Confidence Region of Singular Subspaces

2 Jan 2019 · Dong Xia

This paper studies the normal approximation of singular subspaces when the noise matrix has i.i.d. entries. Our contributions are three-fold. First, we derive an explicit representation formula for the empirical spectral projectors. The formula is concise and holds for deterministic matrix perturbations. Second, we calculate the expected projection distance between the empirical singular subspaces and the true singular subspaces. Our method yields an arbitrary $k$-th order approximation of the expected projection distance. Third, we prove a non-asymptotic normal approximation of the projection distance with different levels of bias correction. With $\lceil \log(d_1+d_2)\rceil$-th order bias corrections, asymptotic normality holds under the optimal signal-to-noise ratio (SNR) condition, where $d_1$ and $d_2$ denote the matrix sizes. In addition, we show that higher-order approximations are unnecessary when $|d_1-d_2|=O((d_1+d_2)^{1/2})$. Finally, we provide comprehensive simulation results to support our theoretical findings. Unlike existing results, our approach is non-asymptotic and comes with explicit convergence rates. Our method allows the rank $r$ to diverge as fast as $o((d_1+d_2)^{1/3})$. Moreover, it requires no eigen-gap condition (beyond the SNR condition) and no constraint on the relation between $d_1$ and $d_2$.
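
The following is a minimal sketch (not the paper's code) of the simulation setup the abstract refers to: a rank-$r$ signal matrix observed under i.i.d. Gaussian noise, with the squared projection distance $\|\hat U\hat U^\top - UU^\top\|_F^2$ between the empirical and true left singular subspaces computed from the top-$r$ singular vectors. The dimensions, rank, and signal strength below are illustrative choices, not values prescribed by the paper.

```python
# Sketch of the signal-plus-noise model and the projection distance.
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r = 200, 150, 3            # matrix sizes and rank (illustrative)
signal = 30.0 * np.sqrt(d1 + d2)   # illustrative signal strength, not the paper's SNR threshold

# True singular subspaces: random orthonormal U (d1 x r) and V (d2 x r).
U, _ = np.linalg.qr(rng.standard_normal((d1, r)))
V, _ = np.linalg.qr(rng.standard_normal((d2, r)))
s = signal * np.linspace(1.0, 2.0, r)       # nonzero singular values
M = U @ np.diag(s) @ V.T                    # rank-r signal matrix

# Observation: signal plus i.i.d. standard Gaussian noise.
A = M + rng.standard_normal((d1, d2))

# Empirical left singular subspace: top-r left singular vectors of A.
U_hat = np.linalg.svd(A, full_matrices=False)[0][:, :r]

# Squared projection (Frobenius) distance between spectral projectors:
# ||U_hat U_hat^T - U U^T||_F^2 = 2 * (r - ||U^T U_hat||_F^2).
dist_sq = 2.0 * (r - np.linalg.norm(U.T @ U_hat, ord="fro") ** 2)
print(f"squared projection distance: {dist_sq:.4f}")
```

Repeating this over many noise draws gives the empirical distribution of the projection distance whose (bias-corrected) normal approximation is the subject of the paper.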
