no code implementations • 15 Nov 2023 • Albert S. Berahas, Lindon Roberts, Fred Roosta
The analysis of gradient descent-type methods typically relies on the Lipschitz continuity of the objective gradient.
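For reference, this is the standard L-smoothness assumption and the descent lemma it yields:

$$\|\nabla f(x) - \nabla f(y)\| \le L\,\|x - y\| \quad\Longrightarrow\quad f(y) \le f(x) + \nabla f(x)^\top (y - x) + \tfrac{L}{2}\|y - x\|^2.$$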
no code implementations • 19 Aug 2023 • Mohammad Sadegh Salehi, Subhadip Mukherjee, Lindon Roberts, Matthias J. Ehrhardt
In this work, we propose an algorithm with a backtracking line search that relies only on inexact function evaluations and hypergradients, and we show convergence to a stationary point.
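For orientation, here is a minimal sketch of a classical Armijo backtracking line search in Python. The function names are hypothetical, and the paper's method differs in that it works with inexact evaluations and hypergradients rather than the exact values used below.

```python
import numpy as np

def backtracking_step(f, grad_f, x, c=1e-4, rho=0.5, t=1.0, max_backtracks=50):
    """Classical Armijo backtracking: shrink the step size t until the
    sufficient-decrease condition holds. Exact-evaluation version only;
    the paper replaces f and grad_f with inexact estimates plus tolerances."""
    g = grad_f(x)
    fx = f(x)
    for _ in range(max_backtracks):
        if f(x - t * g) <= fx - c * t * float(g @ g):  # Armijo condition
            break
        t *= rho  # backtrack
    return x - t * g

# Usage on a simple quadratic, where the exact gradient is available:
x = backtracking_step(lambda z: float(z @ z), lambda z: 2.0 * z, np.array([3.0, -4.0]))
```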
no code implementations • 11 Jan 2023 • Matthias J. Ehrhardt, Lindon Roberts
Estimating hyperparameters has been a long-standing problem in machine learning.
no code implementations • 7 Nov 2022 • Andrew M. Kingston, Lindon Roberts, Alaleh Aminzadeh, Daniele Pelliccia, Imants D. Svalbe, David M. Paganin
Classical ghost imaging is a new paradigm in imaging where the image of an object is not measured directly with a pixelated detector; instead, it is reconstructed from correlations between a sequence of illumination patterns and single-pixel (bucket) detector measurements.
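A minimal NumPy sketch of the standard correlation reconstruction used in classical ghost imaging, with a hypothetical test object and random speckle patterns; this illustrates the general setup rather than this paper's specific method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 32, 4000                              # image size and number of patterns
X = np.zeros((n, n)); X[8:24, 8:24] = 1.0    # hypothetical test object
P = rng.random((m, n, n))                    # random illumination patterns P_i
b = np.tensordot(P, X)                       # bucket signals b_i = <P_i, X>

# Correlation reconstruction: X_hat ~ (1/m) sum_i (b_i - <b>)(P_i - <P>),
# which recovers the object up to an affine rescaling of intensities.
X_hat = np.tensordot(b - b.mean(), P - P.mean(axis=0), axes=1) / m
```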
no code implementations • 25 Aug 2022 • Lindon Roberts, Edward Smyth
In distributed learning, a central server trains a model according to updates provided by nodes holding local data samples.
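As a rough illustration of this setup, here is a hypothetical server-side aggregation step in the style of federated averaging; the names and the simple mean aggregation are assumptions, not this paper's method.

```python
import numpy as np

def server_round(model, node_updates):
    """One aggregation round: average the updates sent by the nodes and
    apply them to the global model (federated-averaging style)."""
    return model + np.mean(node_updates, axis=0)

rng = np.random.default_rng(0)
model = np.zeros(10)
node_updates = [0.01 * rng.standard_normal(10) for _ in range(5)]  # one per node
model = server_round(model, node_updates)
```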
1 code implementation • 6 Nov 2020 • Matthias J. Ehrhardt, Lindon Roberts
Here, we apply a recent dynamic-accuracy derivative-free optimization method to hyperparameter tuning; this allows inexact evaluations of the learning problem while retaining convergence guarantees.
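To illustrate the general idea of treating hyperparameter tuning as a black-box problem, here is a sketch using SciPy's Nelder-Mead as a generic derivative-free stand-in; the placeholder loss is hypothetical, and the paper's dynamic-accuracy method additionally controls how accurately each evaluation is performed.

```python
import numpy as np
from scipy.optimize import minimize

def validation_loss(log_lam):
    """Black-box objective: train with regularization weight exp(log_lam)
    and return the validation loss. The quadratic below is a hypothetical
    placeholder for the (inexactly solved) lower-level learning problem."""
    lam = np.exp(log_lam[0])
    return (lam - 0.1) ** 2

result = minimize(validation_loss, x0=[0.0], method="Nelder-Mead")
print(np.exp(result.x[0]))  # tuned hyperparameter
```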
no code implementations • 26 Jul 2020 • Coralia Cartis, Tyler Ferguson, Lindon Roberts
Derivative-free (or zeroth-order) optimization (DFO) has recently gained attention for its ability to solve problems in a variety of application areas, including machine learning, particularly those involving objectives which are stochastic and/or expensive to compute.
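As a concrete example of the zeroth-order setting, the following sketch estimates a gradient from function values alone via Gaussian smoothing; this is a standard construction shown for illustration, not the method proposed in the paper.

```python
import numpy as np

def zeroth_order_gradient(f, x, sigma=1e-3, samples=20, rng=None):
    """Gaussian-smoothing gradient estimate using only function values:
    E[(f(x + sigma*u) - f(x)) / sigma * u] approximates grad f(x)."""
    rng = rng or np.random.default_rng()
    g = np.zeros_like(x)
    for _ in range(samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + sigma * u) - f(x)) / sigma * u
    return g / samples

# Usage: estimate the gradient of a quadratic without calling its derivative.
g = zeroth_order_gradient(lambda z: float(z @ z), np.array([1.0, -2.0]))
```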
1 code implementation • 23 Jun 2020 • Matthias J. Ehrhardt, Lindon Roberts
A drawback of these techniques is that they depend on a number of parameters that have to be set by the user.
1 code implementation • 29 Dec 2018 • Coralia Cartis, Lindon Roberts, Oliver Sheridan-Methven
We apply a state-of-the-art, local derivative-free solver, Py-BOBYQA, to global optimization problems, and propose an algorithmic improvement that is beneficial in this context.
Optimization and Control
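A brief usage sketch of the Py-BOBYQA solver named above, based on its documented interface; the seek_global_minimum option enables the multiple-restarts heuristic for escaping local minima that is associated with this work.

```python
import numpy as np
import pybobyqa  # pip install Py-BOBYQA

def rosenbrock(x):
    """Standard 2D test problem used in the Py-BOBYQA documentation."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

x0 = np.array([-1.2, 1.0])
lower, upper = np.array([-5.0, -5.0]), np.array([5.0, 5.0])

# seek_global_minimum=True turns on the restarts-based heuristic for
# global optimization problems.
soln = pybobyqa.solve(rosenbrock, x0, bounds=(lower, upper), seek_global_minimum=True)
print(soln)
```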
3 code implementations • 31 Mar 2018 • Coralia Cartis, Jan Fiala, Benjamin Marteau, Lindon Roberts
Numerical results show that DFO-LS can make reasonable progress on some medium-scale problems using fewer objective evaluations than are needed for a single gradient evaluation.
Optimization and Control
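A brief usage sketch of DFO-LS, based on its documented interface: the objective is supplied as a vector of residuals r(x), and the solver minimizes ||r(x)||^2 without derivatives.

```python
import numpy as np
import dfols  # pip install DFO-LS

def residuals(x):
    """Rosenbrock in least-squares form: r1 = 10(x2 - x1^2), r2 = 1 - x1."""
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

x0 = np.array([-1.2, 1.0])
soln = dfols.solve(residuals, x0)  # derivative-free nonlinear least squares
print(soln)
```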