Journal article in SIAM Journal on Optimization, 2018

On the convergence of gradient-like flows with noisy gradient input

Abstract

In view of solving convex optimization problems with noisy gradient input, we analyze the asymptotic behavior of gradient-like flows that are subject to stochastic disturbances. Specifically, we focus on the widely studied class of mirror descent methods for constrained convex programming and we examine the dynamics' convergence and concentration properties in the presence of noise. In the small noise limit, we show that the dynamics converge to the solution set of the underlying problem (a.s.). Otherwise, if the noise is persistent, we estimate the measure of the dynamics' long-run concentration around interior solutions and their convergence to boundary solutions that are sufficiently "robust". Finally, we show that a suitably rectified variant of the method converges irrespective of the magnitude of the noise (or the structure of the underlying convex program), and we derive an explicit estimate for its rate of convergence.
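To make the setting concrete, here is a minimal sketch of stochastic mirror descent on the probability simplex with the entropic mirror map (exponentiated-gradient updates). The objective, step-size schedule, noise model, and function names are illustrative assumptions for this sketch, not the paper's exact construction; in particular, the vanishing step size and ergodic averaging stand in for the "small noise limit" regime discussed in the abstract.

```python
import numpy as np

def stochastic_mirror_descent(grad, x0, steps=2000, noise=0.1, seed=0):
    """Sketch of entropic mirror descent with noisy gradient input.

    grad  : callable returning the (exact) gradient at x
    x0    : starting point on the probability simplex
    noise : std. deviation of the additive Gaussian gradient noise
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        # Noisy gradient input: exact gradient plus a stochastic disturbance.
        g = grad(x) + noise * rng.standard_normal(x.shape)
        eta = 1.0 / np.sqrt(t)          # vanishing step size (illustrative choice)
        y = x * np.exp(-eta * g)        # entropic (multiplicative) update
        x = y / y.sum()                 # Bregman projection back to the simplex
        avg += (x - avg) / t            # running ergodic average of the iterates
    return avg

# Example: minimize f(x) = 0.5 * ||x - p||^2 over the simplex; the minimizer is p.
p = np.array([0.5, 0.3, 0.2])
x_star = stochastic_mirror_descent(lambda x: x - p, np.ones(3) / 3)
```

Despite the persistent gradient noise, the averaged iterate concentrates near the interior solution `p`, in line with the concentration behavior the abstract describes.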
Files

  • Main.pdf (970.36 KB)
  • StochasticMirrorDescent.pdf (1.08 MB)

Origin: Files produced by the author(s)

Dates and versions

hal-01404586 , version 1 (28-11-2016)

Identifiers

  • HAL Id : hal-01404586 , version 1

Cite

Panayotis Mertikopoulos, Mathias Staudigl. On the convergence of gradient-like flows with noisy gradient input. SIAM Journal on Optimization, 2018, 28 (1), pp.163-197. ⟨hal-01404586⟩