Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements

Kimon Antonakopoulos 1, Panayotis Mertikopoulos 1, 2
1 POLARIS - Performance analysis and optimization of LARge Infrastructures and Systems
Inria Grenoble - Rhône-Alpes, LIG - Laboratoire d'Informatique de Grenoble
Abstract: We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth in the standard sense. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function, as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives, such as Fisher markets, Poisson tomography, D-design, and the like. In this setting, the application of existing order-optimal adaptive methods, like UnixGrad or AcceleGrad, is not possible, especially in the presence of randomness and uncertainty. The proposed method, adaptive mirror descent (AdaMir), aims to close this gap by concurrently achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
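To give a rough flavor of the adaptive mirror descent template the abstract refers to, the sketch below pairs an entropic Bregman function on the probability simplex (whose mirror step is a multiplicative-weights update) with an AdaGrad-style step size driven by accumulated gradient norms. The function name `adamir_sketch` and the specific step-size rule are illustrative assumptions, not the paper's exact AdaMir policy.

```python
import numpy as np

def adamir_sketch(grad, x0, n_steps=100):
    """Illustrative adaptive mirror descent on the simplex.

    Bregman function: negative entropy h(x) = sum_i x_i log x_i,
    whose mirror step is x <- x * exp(-eta * g), renormalized.
    Step size: AdaGrad-style eta_t = 1 / sqrt(1 + sum_s ||g_s||^2).
    NOTE: a hypothetical sketch, not the paper's AdaMir step-size policy.
    """
    x = np.asarray(x0, dtype=float)
    cum_sq = 0.0  # running sum of squared gradient norms
    for _ in range(n_steps):
        g = grad(x)
        cum_sq += float(np.dot(g, g))
        eta = 1.0 / np.sqrt(1.0 + cum_sq)
        x = x * np.exp(-eta * g)   # entropic mirror (multiplicative) step
        x = x / x.sum()            # project back onto the simplex
    return x

# Example: minimize the linear objective f(x) = <c, x> over the simplex;
# the iterates concentrate mass on the coordinate with the smallest c_i.
c = np.array([0.3, 0.1, 0.5])
x_star = adamir_sketch(lambda x: c, np.ones(3) / 3)
```

For a linear objective the renormalized iterates satisfy x_i(T) ∝ x_i(0) · exp(−Σ_t η_t c_i), so the weight drifts toward the smallest cost coordinate while the adaptive step size requires no prior knowledge of a Lipschitz constant.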
Document type: Conference papers
Contributor: Panayotis Mertikopoulos
Submitted on: Monday, September 13, 2021 - 5:11:42 PM
Last modification on: Wednesday, July 6, 2022 - 4:16:28 AM




  • HAL Id: hal-03342998, version 1


Kimon Antonakopoulos, Panayotis Mertikopoulos. Adaptive first-order methods revisited: Convex optimization without Lipschitz requirements. NeurIPS 2021 - 35th International Conference on Neural Information Processing Systems, Dec 2021, Virtual. ⟨hal-03342998⟩


