Stochastic Optimization with Variance Reduction for Infinite Datasets with Finite-Sum Structure

Abstract: Stochastic optimization algorithms with variance reduction have proven successful for minimizing large finite sums of functions. Unfortunately, these techniques are unable to deal with stochastic perturbations of input data, induced for example by data augmentation. In such cases, the objective is no longer a finite sum, and the main candidate for optimization is the stochastic gradient descent (SGD) method. In this paper, we introduce a variance reduction approach for these settings when the objective is composite and strongly convex. The convergence rate outperforms SGD with a typically much smaller constant factor, which depends only on the variance of gradient estimates induced by the perturbations on a single example.
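For context, the setting described in the abstract can be sketched as follows (the notation below is assumed for illustration and is not taken from this record): each example i is subject to random perturbations ρ, e.g. from data augmentation, so the objective is a finite sum of expectations rather than a plain finite sum,

    F(x) = \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}_{\rho}\!\left[ \tilde{f}_i(x, \rho) \right] + h(x),

where F is strongly convex, each f_i(x) = \mathbb{E}_{\rho}[\tilde{f}_i(x, \rho)] is smooth, and h is a (possibly non-smooth) convex regularizer, making the objective composite. Variance reduction can then exploit the finite sum over the n examples, so that only the noise due to the perturbations ρ on a single example remains; this is the smaller constant factor mentioned in the abstract.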
Document type:
Conference paper
NIPS 2017 - Advances in Neural Information Processing Systems, Dec 2017, Long Beach, CA, United States. pp. 1-21

https://hal.inria.fr/hal-01375816
Contributor: Alberto Bietti
Submitted on: Wednesday, November 15, 2017 - 13:24:21
Last modified on: Friday, November 24, 2017 - 13:27:50

Identifiers

  • HAL Id: hal-01375816, version 6
  • arXiv: 1610.00970

Citation

Alberto Bietti, Julien Mairal. Stochastic Optimization with Variance Reduction for Infinite Datasets with Finite-Sum Structure. NIPS 2017 - Advances in Neural Information Processing Systems, Dec 2017, Long Beach, CA, United States. pp. 1-21. 〈hal-01375816v6〉

Metrics

  • Record views: 67
  • File downloads: 32