
Super-Acceleration with Cyclical Step-sizes

Abstract: Cyclical step-sizes are becoming increasingly popular in the optimization of deep learning problems. Motivated by recent observations on the spectral gaps of Hessians in machine learning, we show that these step-size schedules offer a simple way to exploit them. More precisely, we develop a convergence rate analysis for quadratic objectives that provides optimal parameters and shows that cyclical learning rates can improve upon traditional lower complexity bounds. We further propose a systematic approach to design optimal first-order methods for quadratic minimization with a given spectral structure. Finally, we provide a local convergence rate analysis beyond quadratic minimization for the proposed methods and illustrate our findings through benchmarks on least squares and logistic regression problems.
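To make the idea concrete, here is a minimal, self-contained sketch of gradient descent with a cyclical (two-step) step-size schedule on a quadratic whose Hessian spectrum has two well-separated eigenvalue clusters, the structure the abstract refers to. The specific cluster ranges and step-size values below are illustrative choices for this sketch, not the optimal parameters derived in the paper.

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 * x^T H x whose Hessian eigenvalues
# form two clusters separated by a spectral gap (the setting in which
# cyclical step-sizes are analyzed in the paper).
rng = np.random.default_rng(0)
eigs = np.concatenate([rng.uniform(1.0, 2.0, 25),    # slow cluster, near mu
                       rng.uniform(9.0, 10.0, 25)])  # fast cluster, near L
H = np.diag(eigs)
x0 = rng.standard_normal(50)

def f(x):
    return 0.5 * x @ H @ x

def gd(x, steps, n_iter):
    """Gradient descent on f, cycling through the given list of step-sizes."""
    for k in range(n_iter):
        x = x - steps[k % len(steps)] * (H @ x)  # grad f(x) = H x
    return x

# Baseline: constant step-size 2/(mu + L), the classical tuning for
# gradient descent on [mu, L] = [1, 10].
x_const = gd(x0, [2.0 / (1.0 + 10.0)], 50)

# Two-step cycle: one step-size tuned to each cluster (hand-picked here,
# not the paper's optimal schedule). The large step targets the slow
# cluster; the small step keeps the fast cluster stable.
x_cyc = gd(x0, [2.0 / (9.0 + 10.0), 2.0 / (1.0 + 2.0)], 50)

print(f(x_const), f(x_cyc))
```

Within a cycle the large step transiently amplifies the fast eigenmodes, but the product of the two per-step contraction factors is below one on both clusters, so each full cycle contracts every mode — this is why the cyclical schedule can beat the constant-step rate when there is a gap in the spectrum.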
Document type: Preprints, Working Papers, ...
Contributor: Adrien Taylor
Submitted on: Thursday, October 14, 2021 - 10:33:37 AM
Last modification on: Friday, November 18, 2022 - 9:23:02 AM



  • HAL Id: hal-03377367, version 1
  • arXiv: 2106.09687


Baptiste Goujaud, Damien Scieur, Aymeric Dieuleveut, Adrien Taylor, Fabian Pedregosa. Super-Acceleration with Cyclical Step-sizes. {date}. ⟨hal-03377367⟩


