no code implementations • 13 Jul 2023 • Rishabh Dixit, Mert Gurbuzbalaban, Waheed U. Bajwa
This work also develops two metrics for the asymptotic rates of convergence and divergence, and evaluates these metrics near strict saddle points for several popular accelerated methods, such as Nesterov's accelerated gradient (NAG) method and Nesterov's accelerated gradient with constant momentum (NCM).
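A minimal illustrative sketch (not the paper's metrics), assuming the toy quadratic strict-saddle objective f(x) = 0.5*(x1^2 - x2^2): it iterates a standard NAG-style update and a constant-momentum (NCM-style) update started near the saddle and records the average per-step growth of the distance from the saddle as a crude empirical rate. The step size, momentum schedules, and initial point are all illustrative assumptions.

```python
import numpy as np

def grad(x):
    # f(x) = 0.5 * (x[0]**2 - x[1]**2): a strict saddle at the origin
    return np.array([x[0], -x[1]])

def run(momentum_rule, steps=200, lr=0.1, x0=(1e-3, 1e-3)):
    """Iterate an accelerated method and return distances from the saddle."""
    x_prev = x = np.array(x0, dtype=float)
    dists = []
    for k in range(steps):
        beta = momentum_rule(k)
        y = x + beta * (x - x_prev)          # momentum (look-ahead) step
        x_prev, x = x, y - lr * grad(y)      # gradient step at the look-ahead point
        dists.append(np.linalg.norm(x))
    return np.array(dists)

# NAG-style vanishing momentum vs. constant momentum (NCM-style)
nag_dists = run(lambda k: k / (k + 3))
ncm_dists = run(lambda k: 0.9)

# Crude empirical divergence rate: average per-step growth factor of ||x_k||
print("NAG growth factor:", (nag_dists[-1] / nag_dists[0]) ** (1 / len(nag_dists)))
print("NCM growth factor:", (ncm_dists[-1] / ncm_dists[0]) ** (1 / len(ncm_dists)))
```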
no code implementations • 7 Jan 2021 • Rishabh Dixit, Mert Gurbuzbalaban, Waheed U. Bajwa
This paper concerns the convergence of discrete-time first-order methods to a local minimum of nonconvex optimization problems whose landscapes contain strict saddle points.
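A small sketch of the phenomenon being studied (not the paper's analysis), assuming the toy objective f(x, y) = (x^2 - 1)^2 + y^2, which has a strict saddle at the origin and local minima at (±1, 0): plain gradient descent started in a tiny ball around the saddle eventually escapes and converges to one of the minima.

```python
import numpy as np

def grad(p):
    # f(x, y) = (x**2 - 1)**2 + y**2
    # strict saddle at (0, 0), local (and global) minima at (+/-1, 0)
    x, y = p
    return np.array([4 * x * (x**2 - 1), 2 * y])

rng = np.random.default_rng(0)
p = 1e-6 * rng.standard_normal(2)   # start in a tiny ball around the saddle
lr = 0.05

for k in range(2000):
    p = p - lr * grad(p)

print("final iterate:", p)             # expected close to (+1, 0) or (-1, 0)
print("gradient norm:", np.linalg.norm(grad(p)))
```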
no code implementations • 1 Jun 2020 • Rishabh Dixit, Mert Gurbuzbalaban, Waheed U. Bajwa
This paper considers the problem of characterizing the exit time of trajectories of gradient-related first-order methods from saddle neighborhoods under certain initial boundary conditions.
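A minimal sketch of the kind of quantity involved (not the paper's bounds), assuming a quadratic model f(x) = 0.5*x^T H x with one negative eigenvalue: gradient descent is initialized on a small sphere around the saddle, and the exit time is the number of iterations until the trajectory leaves an epsilon-neighborhood. The matrix H, radius, and step size are illustrative assumptions.

```python
import numpy as np

# Quadratic strict-saddle model: f(x) = 0.5 * x^T H x with one negative eigenvalue
H = np.diag([1.0, 0.5, -0.2])

def exit_time(x0, lr=0.1, eps=1.0, max_iter=10_000):
    """Number of gradient steps until the iterate leaves the eps-ball around the saddle."""
    x = np.array(x0, dtype=float)
    for k in range(max_iter):
        if np.linalg.norm(x) > eps:
            return k
        x = x - lr * (H @ x)           # gradient step for f(x) = 0.5 x^T H x
    return max_iter                     # did not exit within the budget

rng = np.random.default_rng(1)
r = 1e-3                                # radius of the initial sphere around the saddle
for trial in range(5):
    x0 = rng.standard_normal(3)
    x0 = r * x0 / np.linalg.norm(x0)    # initial condition on the sphere of radius r
    print("exit time:", exit_time(x0))
```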
no code implementations • 16 May 2019 • Rishabh Dixit, Amrit Singh Bedi, Ketan Rajawat
The empirical performance of the proposed algorithm is evaluated on a distributed dynamic sparse recovery problem, where it is shown to incur a dynamic regret close to that of its centralized counterpart.
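A compact sketch of the dynamic-regret notion (not the paper's distributed algorithm), assuming a simplified single-node online setting: a sparse target drifts slightly at each time step, the learner takes one ISTA (proximal gradient) step, and the dynamic regret accumulates the loss gap to an approximate per-step comparator. All problem sizes, drift model, and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, T, lam, lr = 30, 50, 200, 0.1, 0.05

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

A = rng.standard_normal((n, d)) / np.sqrt(n)
theta = np.zeros(d); theta[:5] = 1.0          # slowly drifting sparse target
x = np.zeros(d)
regret = 0.0

for t in range(T):
    theta = theta + 1e-3 * rng.standard_normal(d) * (np.abs(theta) > 0)  # drift on the support
    y = A @ theta + 0.01 * rng.standard_normal(n)
    loss = lambda z: 0.5 * np.linalg.norm(A @ z - y) ** 2 + lam * np.linalg.norm(z, 1)
    # one online ISTA (proximal gradient) step on the current loss
    x = soft_threshold(x - lr * A.T @ (A @ x - y), lr * lam)
    # dynamic regret: compare against an (approximate) per-step minimizer
    z = np.zeros(d)
    for _ in range(500):                      # inner ISTA loop to approximate the comparator
        z = soft_threshold(z - lr * A.T @ (A @ z - y), lr * lam)
    regret += loss(x) - loss(z)

print("approximate dynamic regret over T =", T, "steps:", regret)
```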