no code implementations • 19 Dec 2023 • Luyao Guo, Luqing Wang, Xinli Shi, Jinde Cao
Distributed optimization methods with probabilistic local updates have recently gained attention for their provable communication acceleration.
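As a rough illustration of the idea (a toy sketch, not the paper's algorithm), each node can take local gradient steps every round while the nodes synchronize by averaging only with probability p per round; the function name, objectives, and parameters below are hypothetical:

```python
import numpy as np

def probabilistic_local_updates(grads, x0, lr=0.1, p=0.2, steps=100, seed=0):
    """Toy distributed gradient method: every node takes a local
    gradient step each round, but the nodes average their iterates
    (i.e., communicate) only with probability p per round."""
    rng = np.random.default_rng(seed)
    n = len(grads)
    xs = [x0.copy() for _ in range(n)]
    for _ in range(steps):
        # local step on each node's own objective
        xs = [x - lr * g(x) for x, g in zip(xs, grads)]
        # skipped communication: synchronize only with probability p
        if rng.random() < p:
            avg = sum(xs) / n
            xs = [avg.copy() for _ in range(n)]
    return sum(xs) / n

# two quadratics f_i(x) = 0.5*(x - c_i)^2 with centers +1 and -1;
# the average objective is minimized at x = 0
grads = [lambda x: x - 1.0, lambda x: x + 1.0]
x_avg = probabilistic_local_updates(grads, np.array([5.0]))
```

Skipping communication with probability 1 - p is exactly what makes the per-round communication cost scale with p, which is the quantity the complexity bounds in this line of work control.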
no code implementations • 12 Oct 2023 • Luyao Guo, Sulaiman A. Alghunaim, Kun Yuan, Laurent Condat, Jinde Cao
We demonstrate that the leading communication complexity of ProxSkip is $\mathcal{O}\left(\frac{p\sigma^2}{n\epsilon^2}\right)$ for non-convex and convex settings, and $\mathcal{O}\left(\frac{p\sigma^2}{n\epsilon}\right)$ for the strongly convex setting, where $n$ represents the number of nodes, $p$ denotes the probability of communication, $\sigma^2$ signifies the level of stochastic noise, and $\epsilon$ denotes the desired accuracy level.
no code implementations • 7 Feb 2023 • Luyao Guo, Xinli Shi, Jinde Cao, ZiHao Wang
The proposed algorithm uses uncoordinated, network-independent constant stepsizes and only needs to solve a sequence of proximal mappings approximately. This is advantageous for decentralized composite optimization problems in which the proximal mappings of the nonsmooth loss functions may not admit analytical solutions.
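To see why approximate proximal evaluation matters, contrast a regularizer with a closed-form prox (the $\ell_1$ norm, whose prox is soft-thresholding) with a generic nonsmooth term whose prox must be computed by an inner iterative solver. The `prox_inexact` routine below, and its fixed inner iteration count, are illustrative stand-ins and not the paper's inexactness criterion:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal mapping of lam*||x||_1: the soft-thresholding
    operator, one of the few nonsmooth terms with a closed-form prox."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_inexact(v, subgrad_g, lam, lr=0.05, iters=200):
    """When prox_{lam*g} has no closed form, approximate it by running
    subgradient descent on the prox subproblem
        0.5*||x - v||^2 + lam*g(x).
    'subgrad_g' returns a subgradient of the nonsmooth term g."""
    x = v.copy()
    for _ in range(iters):
        x -= lr * ((x - v) + lam * subgrad_g(x))
    return x

v = np.array([3.0, -0.2, 1.5])
exact = prox_l1(v, 1.0)
# treating the l1 norm as if it had no closed-form prox, the inner
# loop (with subgradient sign(x)) recovers the same point approximately
approx = prox_inexact(v, np.sign, 1.0)
```

An inexact scheme of this kind only pays off if the outer algorithm tolerates the inner error, which is exactly the property the abstract claims.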
no code implementations • 6 Dec 2022 • Luyao Guo, Jinde Cao, Xinli Shi, Shaofu Yang
In this paper, we propose a novel primal-dual proximal splitting algorithm (PD-PSA), named BALPA, for the composite optimization problem with equality constraints, where the loss function consists of a smooth term and a nonsmooth term composed with a linear mapping.
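For orientation, here is a generic Condat–Vũ-style iteration from the same primal-dual proximal splitting family, applied to $\min_x f(x) + \lambda\|Ax\|_1$ with $f$ smooth. This is a hedged sketch of the family, not BALPA's actual update rule, and the stepsizes are chosen only to satisfy the standard convergence condition $1/\tau - \sigma\|A\|^2 \ge L/2$:

```python
import numpy as np

def primal_dual_splitting(grad_f, A, lam, x0, tau=0.5, sigma=1.0, iters=2000):
    """Generic Condat–Vũ-style primal-dual iteration for
    min_x f(x) + lam*||A x||_1, with f smooth (gradient grad_f).
    Illustrative sketch only, not the paper's algorithm."""
    x = x0.copy()
    y = np.zeros(A.shape[0])  # dual variable for the term lam*||A x||_1
    for _ in range(iters):
        # primal step: gradient of the smooth term plus the dual coupling
        x_new = x - tau * (grad_f(x) + A.T @ y)
        # dual step: the prox of the conjugate of lam*||.||_1 is the
        # projection onto the l_inf ball of radius lam
        y = np.clip(y + sigma * A @ (2 * x_new - x), -lam, lam)
        x = x_new
    return x

c = np.array([3.0, -0.2, 1.5])
# f(x) = 0.5*||x - c||^2, A = I: the solution is the soft-thresholding of c
x_star = primal_dual_splitting(lambda x: x - c, np.eye(3), 1.0, np.zeros(3))
```

The appeal of this family, which the abstract builds on, is that the nonsmooth term composed with the linear mapping is handled entirely through the dual variable, so no prox of $g \circ A$ is ever needed.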
no code implementations • 5 Sep 2022 • Luyao Guo, Xinli Shi, Shaofu Yang, Jinde Cao
In this paper, we propose a novel Dual Inexact Splitting Algorithm (DISA) for distributed convex composite optimization problems, where the local loss function consists of a smooth term and a possibly nonsmooth term composed with a linear mapping.