Inria - Institut national de recherche en sciences et technologies du numérique
Conference paper, 2019

A Generic Acceleration Framework for Stochastic Composite Optimization

Abstract

In this paper, we introduce various mechanisms to obtain accelerated first-order stochastic optimization algorithms when the objective function is convex or strongly convex. Specifically, we extend the Catalyst approach, originally designed for deterministic objectives, to the stochastic setting. Given an optimization method with mild convergence guarantees for strongly convex problems, the challenge is to accelerate convergence to a noise-dominated region, and then achieve convergence with an optimal worst-case complexity depending on the noise variance of the gradients. A side contribution of our work is a generic analysis that can handle inexact proximal operators, providing new insights into the robustness of stochastic algorithms when the proximal operator cannot be computed exactly.
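To make the idea concrete, here is a minimal sketch of a Catalyst-style scheme: an outer loop that approximately solves a proximal subproblem with an inner stochastic solver (plain SGD here), followed by Nesterov-style extrapolation. This is an illustrative toy, not the paper's exact algorithm — the inner solver, step sizes, `kappa`, and the fixed extrapolation parameter `beta` are all placeholder choices, and the least-squares objective is invented for the example.

```python
import numpy as np

def sgd_subproblem(grad_fn, x0, center, kappa, n_steps=200, lr=0.01, rng=None):
    """Approximately minimize f(x) + (kappa/2)||x - center||^2 with SGD.

    `grad_fn(x, rng)` returns a stochastic gradient of f at x; the quadratic
    term makes the subproblem strongly convex, which is what Catalyst exploits.
    """
    x = x0.copy()
    for _ in range(n_steps):
        g = grad_fn(x, rng) + kappa * (x - center)
        x = x - lr * g
    return x

def catalyst_sgd(grad_fn, x0, kappa=1.0, beta=0.5, n_outer=20, rng=None):
    """Catalyst-style outer loop: inexact proximal point + extrapolation.

    Placeholder parameters; the paper derives principled schedules instead.
    """
    x_prev = x0.copy()
    y = x0.copy()
    for _ in range(n_outer):
        # Inexactly solve the proximal subproblem centered at y.
        x = sgd_subproblem(grad_fn, y, y, kappa, rng=rng)
        # Nesterov-style extrapolation of the outer iterates.
        y = x + beta * (x - x_prev)
        x_prev = x
    return x_prev

# Toy problem: least squares with single-sample (noisy) gradients.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
b = A @ np.ones(5)            # true solution is the all-ones vector

def grad_fn(x, rng):
    i = rng.integers(len(b))  # sample one row -> stochastic gradient
    a = A[i]
    return (a @ x - b[i]) * a

x_hat = catalyst_sgd(grad_fn, np.zeros(5), rng=rng)
```

On this toy problem the iterates approach the all-ones solution; with generic noisy gradients one would instead expect convergence to a noise-dominated region, as the abstract describes.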
Main file: main.pdf (710.57 KB)
Origin: files produced by the author(s)

Dates and versions

hal-02139489 , version 1 (29-05-2019)
hal-02139489 , version 2 (03-10-2019)
hal-02139489 , version 3 (07-10-2019)

Identifiers

Cite

Andrei Kulunchakov, Julien Mairal. A Generic Acceleration Framework for Stochastic Composite Optimization. NeurIPS 2019 - Thirty-third Conference on Neural Information Processing Systems, Dec 2019, Vancouver, Canada. pp.12556-12567. ⟨hal-02139489v3⟩