1 code implementation • 5 Feb 2024 • Shahryar Zehtabi, Dong-Jun Han, Rohit Parasnis, Seyyedali Hosseinalipour, Christopher G. Brinton
Existing DFL works have mostly focused on settings where clients conduct a fixed number of local updates between local model exchanges, overlooking heterogeneity and dynamics in clients' communication and computation capabilities.
no code implementations • 23 Nov 2022 • Shahryar Zehtabi, Seyyedali Hosseinalipour, Christopher G. Brinton
We theoretically demonstrate that our methodology converges to the globally optimal learning model at a $O\left(\frac{\ln k}{\sqrt{k}}\right)$ rate under standard assumptions in distributed learning and consensus literature.
1 code implementation • 7 Apr 2022 • Shahryar Zehtabi, Seyyedali Hosseinalipour, Christopher G. Brinton
Through theoretical analysis, we demonstrate that our methodology achieves asymptotic convergence to the globally optimal learning model under standard assumptions in the distributed learning and graph consensus literature, without imposing restrictive connectivity requirements on the underlying topology.