Journal articles

Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

Andrei Kulunchakov (1), Julien Mairal (1)
(1) Thoth (learning models from massive data), Inria Grenoble - Rhône-Alpes, LJK - Laboratoire Jean Kuntzmann
Abstract: In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization by extending the concept of estimate sequence introduced by Nesterov. More precisely, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective, which covers the stochastic gradient descent method and variants of the incremental approaches SAGA, SVRG, and MISO/Finito/SDCA. This point of view has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we propose a new accelerated stochastic gradient descent algorithm and an accelerated SVRG algorithm with optimal complexity that is robust to stochastic noise.
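
For intuition, here is a minimal sketch of the surrogate-minimization viewpoint described in the abstract, in the spirit of Nesterov's estimate-sequence construction; the particular weights and surrogate shown (delta_k, g_k, mu) are illustrative assumptions, not the paper's exact formulation.

    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    % Composite objective F(x) = f(x) + \psi(x): f is smooth and \mu-strongly convex,
    % \psi is convex (possibly nonsmooth), and g_k is an unbiased stochastic estimate
    % of \nabla f(x_{k-1}). Each iteration mixes the previous surrogate d_{k-1} with
    % a simple stochastic model of F; the constant f(x_{k-1}) does not affect the
    % minimizer, so it need not be computed.
    \[
    d_k(x) = (1-\delta_k)\, d_{k-1}(x)
             + \delta_k \Big( f(x_{k-1}) + g_k^\top (x - x_{k-1})
             + \tfrac{\mu}{2}\lVert x - x_{k-1}\rVert^2 + \psi(x) \Big),
    \]
    \[
    x_k = \operatorname*{arg\,min}_{x}\; d_k(x).
    \]
    \end{document}

Under this reading, taking g_k to be a plain stochastic gradient yields a proximal variant of SGD, while plugging in variance-reduced estimators of the kind used by SVRG or SAGA recovers the corresponding incremental methods, which is how a single generic convergence proof can cover the whole family.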

https://hal.inria.fr/hal-01993531
Contributor: Julien Mairal
Submitted on: Friday, September 4, 2020 - 9:05:10 AM
Last modification on: Tuesday, September 8, 2020 - 1:35:19 PM

Files

19-073.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01993531, version 4
  • arXiv: 1901.08788

Citation

Andrei Kulunchakov, Julien Mairal. Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise. Journal of Machine Learning Research, Microtome Publishing, in press. ⟨hal-01993531v4⟩
