Search Results for author: Michiel E. Hochstenbach

Found 2 papers, 0 papers with code

On the Convergence of the Gradient Descent Method with Stochastic Fixed-point Rounding Errors under the Polyak-Łojasiewicz Inequality

no code implementations • 23 Jan 2023 • Lu Xia, Michiel E. Hochstenbach, Stefano Massei

When training neural networks with low-precision computation, rounding errors often cause stagnation or are otherwise detrimental to the convergence of the optimizer; in this paper we study the influence of rounding errors on the convergence of the gradient descent method for problems satisfying the Polyak-Łojasiewicz inequality.
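
For context, the Polyak-Łojasiewicz (PL) condition named in the title is the standard inequality below (stated here for reference; it is not quoted from the listing itself): for some constant µ > 0, the squared gradient norm of the objective f bounds the suboptimality gap at every point x.

```latex
% Polyak-Lojasiewicz inequality: f has minimum value f^*, mu > 0.
% Under this condition, gradient descent converges linearly even
% though f need not be convex.
\[
  \tfrac{1}{2}\,\lVert \nabla f(x) \rVert^2 \;\ge\; \mu \bigl( f(x) - f^* \bigr)
  \qquad \text{for all } x .
\]
```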

On the influence of stochastic roundoff errors and their bias on the convergence of the gradient descent method with low-precision floating-point computation

no code implementations • 24 Feb 2022 • Lu Xia, Stefano Massei, Michiel E. Hochstenbach, Barry Koren

When implementing the gradient descent method in low precision, employing stochastic rounding schemes helps prevent stagnation of convergence caused by the vanishing gradient effect.
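
To illustrate the idea behind the abstract, here is a minimal NumPy sketch of unbiased stochastic rounding to a fixed-point grid, followed by one low-precision gradient descent step. This is not the authors' implementation; the function name `stochastic_round`, the parameter `num_frac_bits`, and the toy values are illustrative assumptions.

```python
import numpy as np

def stochastic_round(x, num_frac_bits=8):
    """Round x to a fixed-point grid with num_frac_bits fractional bits,
    rounding up with probability equal to the fractional remainder,
    so that E[stochastic_round(x)] == x (unbiased in expectation)."""
    scale = 2.0 ** num_frac_bits
    scaled = x * scale
    floor = np.floor(scaled)
    round_up = np.random.random_sample(np.shape(scaled)) < (scaled - floor)
    return (floor + round_up) / scale

# Hypothetical usage: one gradient descent step in low precision.
# With deterministic round-to-nearest, an update lr * grad smaller than
# half the grid spacing always rounds to zero and the iterate stagnates;
# stochastic rounding still makes progress in expectation.
w = np.array([0.5, -0.25])
grad = np.array([1e-3, -1e-3])
lr = 0.1
w = stochastic_round(w - lr * grad, num_frac_bits=8)
```

The key design point is unbiasedness: because the rounding direction is randomized in proportion to the distance to each grid neighbor, tiny gradient updates survive on average rather than being silently truncated away.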
