1 code implementation • 22 Jan 2024 • Marlon Becker, Frederick Altrock, Benjamin Risse
Sharpness-Aware Minimization (SAM), a recently proposed optimization algorithm for deep neural networks, perturbs the parameters by a gradient ascent step before computing the gradient in order to guide the optimization toward flat regions of the loss landscape.
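The SAM update described above can be sketched as follows. This is a minimal illustration on a toy quadratic loss, not the authors' implementation; the loss function and the `rho` and `lr` values are hypothetical choices for demonstration.

```python
import numpy as np

def loss(w):
    # Toy quadratic loss L(w) = 0.5 * ||w||^2 (hypothetical example objective).
    return 0.5 * np.dot(w, w)

def grad(w):
    # Gradient of the toy loss: grad L(w) = w.
    return w

def sam_step(w, rho=0.05, lr=0.1):
    g = grad(w)
    # Ascent step: perturb parameters along the normalized gradient,
    # moving toward higher loss within a ball of radius rho.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # The gradient is then evaluated at the perturbed point w + eps ...
    g_sam = grad(w + eps)
    # ... and the descent update is applied to the original parameters.
    return w - lr * g_sam

w = np.array([1.0, -2.0])
w_new = sam_step(w)
```

Evaluating the gradient at the perturbed point penalizes sharp minima: a parameter setting only counts as good if the loss stays low in its entire `rho`-neighborhood.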