A Generic Acceleration Framework for Stochastic Composite Optimization

Andrei Kulunchakov 1 Julien Mairal 1
1 Thoth - Learning models from massive data
LJK - Laboratoire Jean Kuntzmann, Inria Grenoble - Rhône-Alpes
Abstract: In this paper, we introduce various mechanisms to obtain accelerated first-order stochastic optimization algorithms when the objective function is convex or strongly convex. Specifically, we extend the Catalyst approach originally designed for deterministic objectives to the stochastic setting. Given an optimization method with mild convergence guarantees for strongly convex problems, the challenge is to accelerate convergence to a noise-dominated region, and then achieve convergence with an optimal worst-case complexity depending on the noise variance of the gradients. A side contribution of our work is also a generic analysis that can handle inexact proximal operators, providing new insights about the robustness of stochastic algorithms when the proximal operator cannot be exactly computed.
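To make the Catalyst idea mentioned in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' algorithm, and not taken from the paper): each outer step approximately solves a strongly convex proximal subproblem min_x f(x) + (kappa/2)||x - y||^2 with a few SGD passes, then applies a Nesterov-style extrapolation to the prox center. All names and parameter choices (`kappa`, `mu`, `step`, the toy least-squares problem) are illustrative assumptions.

```python
import numpy as np

def catalyst_sgd(grad_sample, x0, n_outer=50, n_inner=200,
                 kappa=1.0, mu=0.1, step=0.01, rng=None):
    """Catalyst-style outer loop (illustrative sketch, hypothetical parameters).

    Each outer iteration approximately minimizes
        f(x) + (kappa/2) * ||x - y||^2
    with n_inner SGD steps, then extrapolates the prox center y.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    x = x0.copy()
    y = x0.copy()
    q = mu / (mu + kappa)                        # assumed strong-convexity ratio
    beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))   # extrapolation weight
    for _ in range(n_outer):
        x_prev = x.copy()
        z = x.copy()
        for _ in range(n_inner):                 # inexact prox via plain SGD
            g = grad_sample(z, rng) + kappa * (z - y)
            z = z - step * g
        x = z
        y = x + beta * (x - x_prev)              # Nesterov-style extrapolation
    return x

# Toy problem: noiseless least squares, f(x) = (1/2n) * ||A x - b||^2,
# with stochastic gradients obtained by sampling one row of A at a time.
rng = np.random.default_rng(42)
A = rng.standard_normal((50, 5))
x_star = rng.standard_normal(5)
b = A @ x_star

def grad_sample(z, rng):
    i = rng.integers(len(A))
    return A[i] * (A[i] @ z - b[i])

x_hat = catalyst_sgd(grad_sample, np.zeros(5))
```

In this noiseless (interpolation) setting the per-sample gradient variance vanishes at the solution, so the iterates contract all the way to `x_star`; with noisy gradients one would instead expect convergence to a noise-dominated region, as discussed in the abstract.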
Document type: Preprints, Working Papers, ...

https://hal.inria.fr/hal-02139489
Contributor: Julien Mairal
Submitted on: Wednesday, May 29, 2019 - 12:31:42 PM
Last modification on: Wednesday, June 5, 2019 - 11:06:44 AM

Files

main.pdf (files produced by the author(s))

Identifiers

  • HAL Id: hal-02139489, version 1
  • arXiv: 1906.01164

Citation

Andrei Kulunchakov, Julien Mairal. A Generic Acceleration Framework for Stochastic Composite Optimization. 2019. ⟨hal-02139489⟩
