The last-iterate convergence rate of optimistic mirror descent in stochastic variational inequalities - Inria - Institut national de recherche en sciences et technologies du numérique
Conference paper, Year: 2021

The last-iterate convergence rate of optimistic mirror descent in stochastic variational inequalities

Abstract

In this paper, we analyze the convergence rate of optimistic mirror descent methods in stochastic variational inequalities, a class of optimization problems with important applications to learning theory and machine learning. Our analysis reveals an intricate relation between the algorithm's rate of convergence and the local geometry induced by the method's underlying Bregman function. We quantify this relation by means of the Legendre exponent, a notion that we introduce to measure the growth rate of the Bregman divergence relative to the ambient norm near a solution. We show that this exponent determines both the optimal step-size policy of the algorithm and the optimal rates attained, explaining in this way the differences observed for some popular Bregman functions (Euclidean projection, negative entropy, fractional power, etc.).
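The abstract centers on how the Bregman divergence of the method's underlying Bregman function grows near a solution. As background, the following sketch (not taken from the paper; function names and the test points are illustrative) computes the standard Bregman divergence D_h(p, x) = h(p) − h(x) − ⟨∇h(x), p − x⟩ for two of the Bregman functions the abstract names: the Euclidean half-squared norm, which recovers the squared Euclidean distance, and the negative entropy on the simplex, which recovers the Kullback-Leibler divergence.

```python
import numpy as np

def bregman(h, grad_h, p, x):
    """Bregman divergence D_h(p, x) = h(p) - h(x) - <grad h(x), p - x>."""
    return h(p) - h(x) - np.dot(grad_h(x), p - x)

# Euclidean setup: h(x) = (1/2)||x||^2, so D_h(p, x) = (1/2)||p - x||^2.
h_eucl = lambda x: 0.5 * np.dot(x, x)
g_eucl = lambda x: x

# Negative entropy on the simplex: h(x) = sum_i x_i log x_i, so for
# probability vectors D_h(p, x) is the KL divergence KL(p || x).
h_ent = lambda x: np.sum(x * np.log(x))
g_ent = lambda x: np.log(x) + 1.0

# Two illustrative points on the probability simplex.
p = np.array([0.2, 0.3, 0.5])
x = np.array([1/3, 1/3, 1/3])

d_eucl = bregman(h_eucl, g_eucl, p, x)   # = (1/2)||p - x||^2
d_ent = bregman(h_ent, g_ent, p, x)      # = KL(p || x)
```

The two divergences scale differently with ||p − x|| near a solution; quantifying that local growth rate is, per the abstract, the role of the Legendre exponent.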
Main file
2021-COLT-BregmanRates.pdf (444.04 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03342583, version 1 (13-09-2021)

Identifiers

  • HAL Id: hal-03342583, version 1

Cite

Waïss Azizian, Franck Iutzeler, Jérôme Malick, Panayotis Mertikopoulos. The last-iterate convergence rate of optimistic mirror descent in stochastic variational inequalities. COLT 2021 - 34th Annual Conference on Learning Theory, Aug 2021, Boulder, United States. pp.1-32. ⟨hal-03342583⟩
111 Views
125 Downloads
