no code implementations • 27 May 2024 • Dongyan Huo, Yixuan Zhang, Yudong Chen, Qiaomin Xie
By leveraging the smoothness and recurrence properties of the SA updates, we develop a fine-grained analysis of the correlation between the SA iterates $\theta_k$ and Markovian data $x_k$.
no code implementations • 9 Apr 2024 • Yixuan Zhang, Dongyan Huo, Yudong Chen, Qiaomin Xie
Motivated by Q-learning, we study nonsmooth contractive stochastic approximation (SA) with constant stepsize.
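To make the nonsmoothness concrete: a minimal sketch (assumed toy setup, not the paper's code) of constant-stepsize Q-learning on a hypothetical two-state, two-action MDP. The `max` in the Bellman target makes the SA operator piecewise linear, hence nonsmooth, while the discount factor makes it contractive.

```python
import numpy as np

# Hypothetical MDP for illustration: 2 states, 2 actions, discount gamma.
rng = np.random.default_rng(1)
n_states, n_actions, gamma, alpha = 2, 2, 0.9, 0.1

# Assumed transition kernel P[s, a] -> distribution over next states, rewards R[s, a].
P = np.array([[[0.7, 0.3], [0.4, 0.6]],
              [[0.5, 0.5], [0.1, 0.9]]])
R = np.array([[1.0, 0.0],
              [0.0, 1.0]])

Q = np.zeros((n_states, n_actions))
s = 0
for _ in range(50_000):
    a = rng.integers(n_actions)                # uniform exploration
    s_next = rng.choice(n_states, p=P[s, a])
    # Nonsmooth contractive SA update: the max makes the operator piecewise linear.
    Q[s, a] += alpha * (R[s, a] + gamma * Q[s_next].max() - Q[s, a])
    s = s_next
```

With a constant stepsize the iterates do not converge pointwise; they fluctuate in a neighborhood of the fixed point, which is the regime the paper analyzes.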
no code implementations • 18 Dec 2023 • Dongyan Huo, Yudong Chen, Qiaomin Xie
Our procedure leverages the fast mixing property of constant-stepsize LSA for better covariance estimation and employs Richardson-Romberg (RR) extrapolation to reduce the bias induced by constant stepsize and Markovian data.
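The RR idea can be sketched in a few lines. Under constant stepsize, the averaged iterate satisfies (to first order) $\bar\theta(\alpha) \approx \theta^* + \alpha B + O(\alpha^2)$, so combining two runs with stepsizes $\alpha$ and $2\alpha$ as $2\bar\theta(\alpha) - \bar\theta(2\alpha)$ cancels the $O(\alpha)$ bias. The stand-in function and the constants $\theta^* = 1$, $B = 3$ below are hypothetical, chosen only to make the cancellation visible.

```python
def theta_bar(alpha):
    """Stand-in for an averaged constant-stepsize SA iterate (illustrative, not the paper's estimator).
    Hypothetical values: theta* = 1.0, first-order bias coefficient B = 3.0."""
    theta_star, B = 1.0, 3.0
    return theta_star + alpha * B + 0.5 * alpha**2  # includes a small O(alpha^2) term

alpha = 0.1
plain = theta_bar(alpha)                            # bias ~ alpha * B = 0.3
rr = 2 * theta_bar(alpha) - theta_bar(2 * alpha)    # O(alpha) term cancels
# The RR estimate's error is O(alpha^2), versus O(alpha) for the plain average.
```

In the actual procedure the two averages would come from trajectories run at the two stepsizes; the algebra of the extrapolation step is exactly the two-line combination above.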
no code implementations • 3 Oct 2022 • Dongyan Huo, Yudong Chen, Qiaomin Xie
We consider Linear Stochastic Approximation (LSA) with a constant stepsize and Markovian data.
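The setting can be illustrated with a short sketch (assumed toy instance, not the paper's code): LSA iterates $\theta_{k+1} = \theta_k + \alpha\,(A(x_k)\theta_k + b(x_k))$, where $x_k$ is a two-state Markov chain and the state-dependent pairs $(A(x), b(x))$ below are hypothetical.

```python
import numpy as np

# Assumed two-state Markov chain for the data stream x_k.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Hypothetical state-dependent (A, b); both A's are Hurwitz so the recursion is stable.
A = [np.array([[-1.0, 0.2], [0.0, -0.5]]),
     np.array([[-0.8, 0.0], [0.1, -1.2]])]
b = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

def lsa(alpha, n_steps=50_000, seed=0):
    """Run constant-stepsize LSA along one Markov trajectory; return the tail-averaged iterate."""
    rng = np.random.default_rng(seed)
    x, theta, tail = 0, np.zeros(2), []
    for k in range(n_steps):
        theta = theta + alpha * (A[x] @ theta + b[x])  # LSA update driven by current state
        x = rng.choice(2, p=P[x])                      # Markovian (not i.i.d.) data
        if k >= n_steps // 2:                          # discard first half as burn-in
            tail.append(theta)
    return np.mean(tail, axis=0)

theta_bar = lsa(alpha=0.05)
```

Because the data are Markovian rather than i.i.d., the iterate $\theta_k$ is correlated with $x_k$, and the tail average carries a stepsize-dependent bias, which is the object of the analysis.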