Conference papers

Stochastic Composite Least-Squares Regression with Convergence Rate O(1/n)

Nicolas Flammarion 1,2, Francis Bach 1,2
1 DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
2 SIERRA - Statistical Machine Learning and Parsimony
Abstract: We consider the optimization of a quadratic objective function whose gradients are accessible only through a stochastic oracle that returns the gradient at any given point plus a zero-mean, finite-variance random error. We present the first algorithm that jointly achieves the optimal prediction-error rates for least-squares regression, both in terms of forgetting of initial conditions, in O(1/n^2), and in terms of dependence on the noise and the dimension d of the problem, in O(d/n). Our new algorithm is based on averaged accelerated regularized gradient descent. It may also be analyzed under finer assumptions on the initial conditions and the Hessian matrix, leading to dimension-free quantities that may still be small while the "optimal" terms above are large. To characterize the tightness of these new bounds, we consider an application to non-parametric regression and use the known lower bounds on statistical performance (without computational limits). These lower bounds match the bounds we obtain from a single pass over the data, showing the optimality of our algorithm across a wide variety of trade-offs between bias and variance.
Contributor: Nicolas Flammarion
Submitted on: Tuesday, February 21, 2017
Last modification on: Tuesday, May 4, 2021
Long-term archiving on: Monday, May 22, 2017




  • HAL Id: hal-01472867, version 1
  • arXiv: 1602.05419



Nicolas Flammarion, Francis Bach. Stochastic Composite Least-Squares Regression with Convergence Rate O(1/n). Proceedings of the 30th Conference on Learning Theory (COLT), 2017, Amsterdam, Netherlands. ⟨hal-01472867⟩


