Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods

Abstract: In view of the minimization of a nonsmooth nonconvex function f, we prove an abstract convergence result for descent methods satisfying a sufficient-decrease assumption and allowing a relative error tolerance. Our result guarantees the convergence of bounded sequences, under the assumption that the function f satisfies the Kurdyka-Łojasiewicz inequality. This assumption allows us to cover a wide range of problems, including nonsmooth semi-algebraic (or, more generally, tame) minimization. The specialization of our result to different kinds of structured problems provides several new convergence results for inexact versions of the gradient method, the proximal method, the forward-backward splitting algorithm, the gradient projection method, and some proximal regularizations of the Gauss-Seidel method in a nonconvex setting. Our results are illustrated through feasibility problems and iterative thresholding procedures for compressive sensing.
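As a concrete illustration of the forward-backward splitting scheme mentioned in the abstract, the following Python sketch applies it to an l1-regularized least-squares problem, which yields the iterative soft-thresholding procedure used for compressive sensing. This is a minimal sketch under assumed problem data; the step size, tolerance, and the random test instance are illustrative choices, not taken from the paper.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: component-wise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward_l1(A, b, lam, max_iter=500, tol=1e-8):
    # Forward-backward splitting for min_x 0.5*||A x - b||^2 + lam*||x||_1:
    # explicit gradient step on the smooth part, proximal step on the l1 part.
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                              # forward (gradient) step
        x_new = soft_threshold(x - step * grad, step * lam)   # backward (proximal) step
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Purely illustrative usage on a small random sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = forward_backward_l1(A, b, lam=0.1)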
Keyword: sadco
Document type: Journal article
Mathematical Programming, Springer Verlag, 2011, 〈10.1007/s10107-011-0484-9〉

https://hal.inria.fr/inria-00636457
Contributor: Estelle Bouzat
Submitted on: Thursday, October 27, 2011 - 15:08:11
Last modified on: Monday, March 21, 2016 - 11:32:43

Citation

Hedy Attouch, Jerome Bolte, Benar Fux Svaiter. Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods. Mathematical Programming, Springer Verlag, 2011, 〈10.1007/s10107-011-0484-9〉. 〈inria-00636457〉
