Conference paper, 2018

Constant Step Size Stochastic Gradient Descent for Probabilistic Modeling

Abstract

Stochastic gradient methods enable learning probabilistic models from large amounts of data. While large step-sizes (learning rates) have been shown to be best for least-squares (e.g., Gaussian noise) once combined with parameter averaging, they do not lead to convergent algorithms in general. In this paper, we consider generalized linear models, that is, conditional models based on exponential families. We propose averaging moment parameters instead of natural parameters for constant-step-size stochastic gradient descent. For finite-dimensional models, we show that this can sometimes (and surprisingly) lead to better predictions than the best linear model. For infinite-dimensional models, we show that it always converges to optimal predictions, while averaging natural parameters never does. We illustrate our findings with simulations on synthetic data and classical benchmarks with many observations.
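The distinction between the two averaging schemes can be made concrete on logistic regression. The sketch below (not the authors' code; the function names `constant_step_sgd`, `predict_natural_averaging`, and `predict_moment_averaging` are hypothetical) runs constant-step-size SGD and then either applies the link function to the averaged natural parameter, or averages the predicted probabilities (moment parameters) along the trajectory, as the abstract proposes.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def constant_step_sgd(X, y, step=0.5, seed=0):
    """Constant step-size SGD on the logistic loss.

    Returns the averaged natural parameter (averaged weights) and the
    full trajectory of iterates, so that moment parameters (predicted
    probabilities) can be averaged afterwards.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    theta_sum = np.zeros(d)
    iterates = []
    for _ in range(n):
        i = rng.integers(n)
        # stochastic gradient of the logistic loss at sample i
        grad = (sigmoid(X[i] @ theta) - y[i]) * X[i]
        theta = theta - step * grad
        theta_sum += theta
        iterates.append(theta.copy())
    return theta_sum / n, np.array(iterates)

def predict_natural_averaging(theta_avg, X_test):
    # link function applied to the averaged natural parameter
    return sigmoid(X_test @ theta_avg)

def predict_moment_averaging(iterates, X_test):
    # average of the predictions (moment parameters) along the trajectory
    return sigmoid(X_test @ iterates.T).mean(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 5))
    y = (sigmoid(X @ rng.normal(size=5)) > rng.uniform(size=2000)).astype(float)
    theta_avg, iterates = constant_step_sgd(X, y, step=0.5)
    p_natural = predict_natural_averaging(theta_avg, X)
    p_moment = predict_moment_averaging(iterates, X)
```

Because the sigmoid is nonlinear, the two predictions differ: averaging the moment parameters averages over the stationary distribution of the constant-step-size iterates rather than plugging in a single averaged weight vector.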
Main file: Averaging_predictions.pdf (496.78 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01929810, version 1 (21-11-2018)

Identifiers

Cite

Dmitry Babichev, Francis Bach. Constant Step Size Stochastic Gradient Descent for Probabilistic Modeling. UAI 2018 - Conference on Uncertainty in Artificial Intelligence, Aug 2018, Monterey, United States. ⟨hal-01929810⟩
60 views
59 downloads
