Acceleration through spectral density estimation

ICML 2020  ·  Fabian Pedregosa, Damien Scieur

We develop a framework for designing optimal optimization methods in terms of their average-case runtime. This yields a new class of methods that achieve acceleration through a model of the Hessian's expected spectral density. We develop explicit algorithms for the uniform, Marchenko-Pastur and exponential distributions. These methods are momentum-based gradient algorithms whose hyper-parameters can be estimated cheaply using only the norm and the trace of the Hessian, in stark contrast with classical accelerated methods like Nesterov acceleration and Polyak momentum, which require knowledge of the Hessian's largest and smallest singular values. Empirical results on quadratic problems, logistic regression and neural networks show that the proposed methods always match, and in many cases significantly improve upon, classical accelerated methods.
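To illustrate the kind of information these methods rely on, here is a minimal sketch (not the paper's derived update): a heavy-ball momentum iteration on a random quadratic whose hyper-parameters are set only from the Hessian's operator norm and trace. The rule mapping (norm, trace) to (step size, momentum), which substitutes the mean eigenvalue for the smallest one that classical Polyak/Nesterov tuning would need, is an illustrative assumption.

```python
import numpy as np

# Sketch: momentum (heavy-ball) gradient method on a quadratic
#   f(x) = 0.5 x'Hx - b'x,
# with hyper-parameters derived from cheap spectral summaries of H
# (operator norm and trace) rather than its smallest eigenvalue.
# The tuning rule below is an assumption for illustration only.

rng = np.random.default_rng(0)
d = 200
A = rng.standard_normal((d, d)) / np.sqrt(d)
H = A.T @ A                       # random Hessian with a Marchenko-Pastur-like spectrum
b = rng.standard_normal(d)

L = np.linalg.norm(H, 2)          # largest eigenvalue (operator norm)
mean_eig = np.trace(H) / d        # mean eigenvalue, obtained from the trace

# Illustrative tuning: use the mean eigenvalue as a proxy for the
# "typical" curvature in place of the smallest eigenvalue.
step = 4.0 / (np.sqrt(L) + np.sqrt(mean_eig)) ** 2
momentum = ((np.sqrt(L) - np.sqrt(mean_eig)) / (np.sqrt(L) + np.sqrt(mean_eig))) ** 2

x_prev = np.zeros(d)
x = np.zeros(d)
for _ in range(500):
    grad = H @ x - b
    # heavy-ball update: gradient step plus momentum term
    x, x_prev = x - step * grad + momentum * (x - x_prev), x

print("final gradient norm:", np.linalg.norm(H @ x - b))
```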
