Search Results for author: Haichao Sha

Found 2 papers, 0 papers with code

Clip Body and Tail Separately: High Probability Guarantees for DPSGD with Heavy Tails

no code implementations • 27 May 2024 • Haichao Sha, Yang Cao, Yong liu, Yuncheng Wu, Ruixuan Liu, Hong Chen

However, recent studies have shown that gradients in deep learning exhibit a heavy-tail phenomenon, i.e., the tails of the gradient distribution have infinite variance, which may lead to excessive clipping loss under existing DPSGD mechanisms.
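The abstract above describes clipping heavy-tailed gradients differently from the well-concentrated "body". The paper provides no code, so the following is only a minimal NumPy sketch of the general idea of splitting per-sample gradients by norm and clipping the two groups with different bounds before adding Gaussian noise; the thresholds, noise calibration, and function name are illustrative assumptions, not the paper's exact mechanism.

```python
import numpy as np

def dpsgd_body_tail_clip(per_sample_grads, body_bound=1.0, tail_bound=4.0,
                         noise_multiplier=1.0, rng=None):
    """Illustrative DPSGD-style step: clip "body" and "tail" gradients with
    different bounds, then add Gaussian noise. Bounds and noise calibration
    are assumptions, not the mechanism from the paper above."""
    rng = np.random.default_rng() if rng is None else rng
    norms = np.linalg.norm(per_sample_grads, axis=1)
    # Samples with small norms form the "body"; the rest are the heavy "tail".
    is_body = norms <= body_bound
    clipped = np.empty_like(per_sample_grads)
    for i, g in enumerate(per_sample_grads):
        bound = body_bound if is_body[i] else tail_bound
        scale = min(1.0, bound / (norms[i] + 1e-12))
        clipped[i] = g * scale
    # Gaussian noise scaled to the larger clipping bound (illustrative choice).
    noise = rng.normal(0.0, noise_multiplier * tail_bound,
                       size=per_sample_grads.shape[1])
    return clipped.mean(axis=0) + noise / len(per_sample_grads)
```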

PCDP-SGD: Improving the Convergence of Differentially Private SGD via Projection in Advance

no code implementations • 6 Dec 2023 • Haichao Sha, Ruixuan Liu, Yixuan Liu, Hong Chen

We prove that pre-projection enhances the convergence of DP-SGD by reducing the dependence of the clipping error and bias to a fraction of the top gradient eigenspace and, in theory, limiting cross-client variance to improve convergence under heterogeneous federation.

Federated Learning
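The PCDP-SGD abstract above describes projecting gradients onto a top gradient eigenspace before clipping. Since no implementation is released, here is only a minimal NumPy sketch of projection-before-clipping: the eigenspace is estimated from an auxiliary gradient matrix via SVD, and the `public_grads` input, the parameter names, and the noise calibration are assumptions for illustration, not the paper's procedure.

```python
import numpy as np

def projected_clip_step(per_sample_grads, public_grads, k=10, clip_bound=1.0,
                        noise_multiplier=1.0, rng=None):
    """Illustrative sketch: project per-sample gradients onto the top-k
    eigenspace estimated from auxiliary gradients, then clip and add
    Gaussian noise. Details are assumptions, not the paper's algorithm."""
    rng = np.random.default_rng() if rng is None else rng
    # Top-k right singular vectors of the auxiliary gradient matrix span
    # an estimate of the dominant gradient eigenspace.
    _, _, vt = np.linalg.svd(public_grads, full_matrices=False)
    basis = vt[:k]                                    # shape (k, d)
    projected = per_sample_grads @ basis.T @ basis    # back into R^d
    norms = np.linalg.norm(projected, axis=1, keepdims=True)
    clipped = projected * np.minimum(1.0, clip_bound / (norms + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_bound,
                       size=projected.shape[1])
    return clipped.mean(axis=0) + noise / len(per_sample_grads)
```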
