Journal article in SIAM Journal on Scientific Computing, 2012

Hybrid Deterministic-Stochastic Methods for Data Fitting

Abstract

Many structured data-fitting applications require the solution of an optimization problem involving a sum over a potentially large number of measurements. Incremental gradient algorithms offer inexpensive iterations by sampling only subsets of the terms in the sum. These methods can make great progress initially, but often slow as they approach a solution. In contrast, full gradient methods achieve steady convergence at the expense of evaluating the full objective and gradient on each iteration. We explore hybrid methods that exhibit the benefits of both approaches. Rate of convergence analysis shows that by controlling the size of the subsets in an incremental gradient algorithm, it is possible to maintain the steady convergence rates of full gradient methods. We detail a practical quasi-Newton implementation based on this approach, and numerical experiments illustrate its potential benefits.
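The core idea described above can be illustrated with a minimal sketch: an incremental gradient method for least squares in which the sample size grows geometrically, so that early iterations are cheap while later iterations approach full deterministic gradient steps. This is an illustrative reconstruction, not the paper's implementation (which uses a quasi-Newton update); the function name, batch schedule, and step size below are our own choices.

```python
import numpy as np

def hybrid_gradient(A, b, iters=300, growth=1.1, seed=0):
    """Growing-batch gradient descent sketch for
    min_x (1/2m) * ||A x - b||^2, a sum over m measurements.

    Early iterations sample a small subset of rows (incremental-gradient
    style); the batch size grows geometrically by `growth` per iteration,
    so later iterations effectively take full gradient steps.
    """
    m, n = A.shape
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    step = m / np.linalg.norm(A, 2) ** 2  # 1/L for the averaged objective
    batch = max(1, m // 10)               # start with ~10% of the data
    for _ in range(iters):
        size = min(int(batch), m)
        idx = rng.choice(m, size=size, replace=False)
        # Sampled gradient: average over the chosen measurement rows.
        g = A[idx].T @ (A[idx] @ x - b[idx]) / size
        x -= step * g
        batch *= growth                   # geometric batch-size growth
    return x
```

Once the batch reaches the full data set, the iteration is exactly deterministic gradient descent, which is the mechanism behind the steady convergence rates mentioned in the abstract.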

Dates and versions

inria-00626571 , version 1 (26-09-2011)


Cite

Michael P. Friedlander, Mark Schmidt. Hybrid Deterministic-Stochastic Methods for Data Fitting. SIAM Journal on Scientific Computing, 2012, 34 (3), pp.A1380-A1405. ⟨10.1137/110830629⟩. ⟨inria-00626571⟩