A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives

Mathieu Barré 1 Adrien Taylor 1 Francis Bach 1 
1 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique - ENS Paris, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov's accelerated proximal gradient method) that may rely on approximate proximal operators and that can exploit strong convexity of the objective function. The method supports both relative and absolute errors, and its behavior is illustrated on a set of standard numerical experiments. Using the same developments, we further provide a version of the accelerated proximal hybrid extragradient method of Monteiro and Svaiter (2013) that can exploit strong convexity of the objective function.
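To make the setting concrete, here is a minimal sketch of an accelerated forward-backward iteration of the kind the abstract describes. This is not the authors' exact scheme: it assumes an *exact* proximal operator (whereas the paper's point is to tolerate absolute/relative errors in it), and the function names, the momentum rule, and the lasso-style test problem are illustrative choices of ours. The `mu > 0` branch uses the standard strongly convex momentum coefficient; `mu = 0` falls back to FISTA-style momentum.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (exact; the paper allows inexact proxes).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_fb(grad_f, prox_g, x0, L, mu=0.0, n_iter=200):
    """Sketch of an accelerated forward-backward method for min f(x) + g(x).

    grad_f : gradient of the L-smooth, mu-strongly convex part f
    prox_g : (v, t) -> prox_{t g}(v), the proximal operator of g
    mu = 0 uses FISTA-style momentum; mu > 0 uses the strongly convex variant.
    """
    x = x_prev = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iter):
        if mu > 0:
            q = mu / L
            beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))  # strongly convex momentum
        else:
            t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2   # FISTA momentum sequence
            beta = (t - 1) / t_next
            t = t_next
        y = x + beta * (x - x_prev)                     # extrapolation step
        x_prev = x
        x = prox_g(y - grad_f(y) / L, 1.0 / L)          # forward-backward step
    return x

# Illustrative test problem: min 0.5*||x - b||^2 + ||x||_1,
# whose exact solution is soft_threshold(b, 1).
b = np.array([3.0, -0.5, 2.0])
x_star = accelerated_fb(lambda x: x - b,
                        lambda v, t: soft_threshold(v, t),
                        np.zeros(3), L=1.0, mu=1.0)
```

On this toy problem the iterates reach the closed-form solution `soft_threshold(b, 1) = [2, 0, 1]`; the paper's contribution is precisely the analysis of what happens when the prox step above is only computed approximately.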
Document type :
Journal articles
https://hal.inria.fr/hal-03377374
Contributor: Adrien Taylor
Submitted on: Thursday, October 14, 2021 - 10:35:15 AM
Last modification on: Tuesday, October 25, 2022 - 4:20:05 PM
Citation

Mathieu Barré, Adrien Taylor, Francis Bach. A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives. Open Journal of Mathematical Optimization, 2022, ⟨10.5802/ojmo.12⟩. ⟨hal-03377374⟩
