SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives - Inria, Institut national de recherche en sciences et technologies du numérique
Conference paper, 2014

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives

Abstract

In this work we introduce a new optimisation method called SAGA, in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and supports composite objectives where a proximal operator is applied to the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and it adapts to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
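To make the abstract concrete, here is a minimal sketch of the SAGA update for a smooth finite-sum objective, specialised to least squares for illustration. The function name, the least-squares setup, and the NumPy implementation are my own illustrative choices, not code from the paper; the core update — subtract the stored per-example gradient, add the table average, then refresh the table entry — and the step size of 1/(3L) follow the scheme the paper describes. The proximal step for composite objectives is omitted here for brevity.

```python
import numpy as np

def saga_least_squares(A, b, step, n_iters, seed=0):
    """Sketch of SAGA on (1/n) * sum_i 0.5 * (a_i @ x - b_i)**2.

    Maintains a table of the most recently seen gradient of each f_i and
    uses the variance-reduced direction  g_new - g_old + mean(table).
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    table = np.zeros((n, d))       # stored per-example gradients
    avg = table.mean(axis=0)       # running mean of the table
    for _ in range(n_iters):
        j = rng.integers(n)
        g_new = (A[j] @ x - b[j]) * A[j]    # gradient of f_j at current x
        x = x - step * (g_new - table[j] + avg)
        avg += (g_new - table[j]) / n       # keep the mean in sync
        table[j] = g_new
    return x
```

Unlike SVRG, no full-gradient recomputation passes are needed: the table average is updated incrementally after every step, at the cost of O(n·d) extra memory for the gradient table.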
Main file: nips2014.pdf (376.92 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01016843 , version 1 (01-07-2014)
hal-01016843 , version 2 (17-07-2014)
hal-01016843 , version 3 (12-11-2014)

Cite

Aaron Defazio, Francis Bach, Simon Lacoste-Julien. SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives. Advances In Neural Information Processing Systems, Nov 2014, Montreal, Canada. ⟨hal-01016843v3⟩