Conference papers

Estimate Sequences for Variance-Reduced Stochastic Composite Optimization

Andrei Kulunchakov 1 Julien Mairal 1
1 Thoth - Apprentissage de modèles à partir de données massives
Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
Abstract : In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. This point of view covers the stochastic gradient descent method and variants of the SAGA and SVRG approaches, and it has several advantages: (i) we provide a generic proof of convergence for the aforementioned methods; (ii) we show that the SVRG variant is adaptive to strong convexity; (iii) we naturally obtain new algorithms with the same guarantees; (iv) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we show that this viewpoint is useful to obtain new accelerated algorithms in the sense of Nesterov.
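To illustrate the variance-reduction idea the abstract refers to, here is a minimal sketch of an SVRG-style update (not the paper's specific estimate-sequence construction): each epoch computes a full gradient at an anchor point, and inner stochastic steps correct the sampled gradient with that anchor information so the estimator's variance vanishes near the optimum. The function names and the least-squares setup below are illustrative choices, not from the paper.

```python
import numpy as np

def svrg(grad_i, x0, n, step, n_epochs=100, m=None):
    """Minimal SVRG-style sketch: variance-reduced stochastic gradient steps.

    grad_i(x, i) returns the gradient of the i-th component function at x.
    """
    x = x0.copy()
    m = m or n  # inner-loop length; a common choice is m = n
    rng = np.random.default_rng(0)
    for _ in range(n_epochs):
        anchor = x.copy()
        # full gradient at the anchor point (one pass over the data)
        full_grad = np.mean([grad_i(anchor, i) for i in range(n)], axis=0)
        for _ in range(m):
            i = rng.integers(n)
            # unbiased gradient estimate whose variance shrinks as
            # both x and anchor approach the optimum
            g = grad_i(x, i) - grad_i(anchor, i) + full_grad
            x -= step * g
    return x
```

As a usage sketch, on a least-squares problem with components f_i(x) = 1/2 (a_i^T x - b_i)^2, pass `grad_i = lambda x, i: A[i] * (A[i] @ x - b[i])`; with a small step size the iterates converge linearly to the solution, unlike plain SGD with a constant step.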

Contributor : Julien Mairal
Submitted on : Monday, May 6, 2019 - 10:48:28 PM
Last modification on : Friday, February 4, 2022 - 3:21:15 AM

  • HAL Id : hal-02121913, version 1
  • arXiv : 1905.02374



Andrei Kulunchakov, Julien Mairal. Estimate Sequences for Variance-Reduced Stochastic Composite Optimization. ICML 2019 - 36th International Conference on Machine Learning, Jun 2019, Long Beach, United States. pp.3541-3550. ⟨hal-02121913⟩