J. Abernethy and E. Hazan, Faster Convex Optimization: Simulated Annealing with an Efficient Universal Barrier, Proceedings of The 33rd International Conference on Machine Learning, 2016.

D. Azagra and C. Mudarra, An Extension Theorem for convex functions of class C^{1,1} on Hilbert spaces, Journal of Mathematical Analysis and Applications, vol.446, pp.1167-1182, 2017.

R. Adamczak, A. E. Litvak, A. Pajor, and N. Tomczak-Jaegermann, Quantitative estimates of the convergence of the empirical covariance matrix in log-concave ensembles, Journal of the American Mathematical Society, vol.23, issue.2, pp.535-561, 2010.
URL : https://hal.archives-ouvertes.fr/hal-00793769

R. Badenbroek and E. de Klerk, Complexity analysis of a sampling-based interior point method for convex optimization, 2018.

J. Bolte and E. Pauwels, Curiosities and counterexamples in smooth convex optimization, 2018.

S. Bubeck and R. Eldan, The entropic barrier: a simple and optimal universal self-concordant barrier, Proceedings of the Conference on Learning Theory (COLT), p.279, 2015.

J. P. Crouzeix, A relationship between the second derivatives of a convex function and of its conjugate, Mathematical Programming, vol.13, pp.364-365, 1977.

S. Cyrus, B. Hu, B. Van Scoy, and L. Lessard, A Robust Accelerated Optimization Algorithm for Strongly Convex Functions, Proceedings of the 2018 Annual American Control Conference (ACC), pp.1376-1381, 2018.

A. d'Aspremont, Smooth optimization with approximate gradient, SIAM Journal on Optimization, vol.19, issue.3, pp.1171-1183, 2008.

O. Devolder, F. Glineur, and Y. Nesterov, First-order methods of smooth convex optimization with inexact oracle, Mathematical Programming, vol.146, issue.1-2, pp.37-75, 2014.

Y. Drori and M. Teboulle, An optimal variant of Kelley's cutting-plane method, Mathematical Programming, vol.160, issue.1-2, pp.321-351, 2016.

Y. Drori, On the Properties of Convex Functions over Open Sets, 2018.

Y. Drori and A. B. Taylor, Efficient first-order methods for convex minimization: a constructive approach, Mathematical Programming, 2020.
URL : https://hal.archives-ouvertes.fr/hal-01902048

Y. Drori, Contributions to the Complexity Analysis of Optimization Algorithms, 2014.

Y. Drori and M. Teboulle, Performance of first-order methods for smooth convex minimization: a novel approach, Mathematical Programming, vol.145, issue.1-2, pp.451-482, 2014.

G. Gu and J. Yang, Optimal nonergodic sublinear convergence rate of proximal point algorithm for maximal monotone inclusion problems, 2019.

G. Gu and J. Yang, On the optimal ergodic sublinear convergence rate of the relaxed proximal point algorithm for variational inequalities, 2019.

A. T. Kalai and S. Vempala, Simulated annealing for convex optimization, Mathematics of Operations Research, vol.31, issue.2, pp.253-266, 2006.

D. Kim and J. A. Fessler, Optimized first-order methods for smooth convex minimization, Mathematical Programming, vol.159, issue.1-2, pp.81-107, 2016.

D. Kim and J. A. Fessler, Optimizing the efficiency of first-order methods for decreasing the gradient of smooth convex functions, 2018.

D. Kim, Accelerated proximal point method for maximally monotone operators, 2019.

E. de Klerk, F. Glineur, and A. B. Taylor, On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions, Optimization Letters, vol.11, issue.7, pp.1185-1199, 2017.

L. Lessard, B. Recht, and A. Packard, Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints, SIAM Journal on Optimization, vol.26, issue.1, pp.57-95, 2016.

J. Li, M. S. Andersen, and L. Vandenberghe, Inexact proximal Newton methods for self-concordant functions, Mathematical Methods of Operations Research, vol.85, pp.19-41, 2017.

F. Lieder, On the convergence rate of the Halpern-iteration, 2017.

L. Lovász and S. Vempala, The geometry of logconcave functions and sampling algorithms, Random Structures & Algorithms, vol.30, issue.3, pp.307-358, 2007.

Y. Nesterov, Lectures on convex optimization, vol.137, 2018.

Y. Nesterov and A. S. Nemirovski, Interior point polynomial algorithms in convex programming, 1994.

B. T. Polyak, Convergence of methods of feasible directions in extremal problems, USSR Computational Mathematics and Mathematical Physics, vol.11, issue.4, pp.53-70, 1971.

J. Renegar, A Mathematical View of Interior-Point Methods in Convex Optimization, 2001.

E. K. Ryu, A. B. Taylor, C. Bergeling, and P. Giselsson, Operator splitting performance estimation: Tight contraction factors and optimal parameter selection, 2018.
URL : https://hal.archives-ouvertes.fr/hal-02956361

R. L. Smith, Efficient Monte Carlo procedures for generating points uniformly distributed over bounded regions, Operations Research, vol.32, issue.6, pp.1296-1308, 1984.

M. Schmidt, N. Le Roux, and F. Bach, Convergence rates of inexact proximal-gradient methods for convex optimization, Advances in Neural Information Processing Systems, pp.1458-1466, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00618152

A. B. Taylor, J. M. Hendrickx, and F. Glineur, Exact worst-case convergence rates of the proximal gradient method for composite convex minimization, Journal of Optimization Theory and Applications, vol.178, issue.2, pp.455-476, 2018.

A. B. Taylor, J. M. Hendrickx, and F. Glineur, Smooth strongly convex interpolation and exact worst-case performance of first-order methods, Mathematical Programming, pp.307-345, 2017.

A. B. Taylor, J. M. Hendrickx, and F. Glineur, Exact worst-case performance of first-order methods for composite convex optimization, SIAM Journal on Optimization, vol.27, issue.3, pp.1283-1313, 2017.

A. B. Taylor, B. Van Scoy, and L. Lessard, Lyapunov functions for first-order methods: Tight automated convergence guarantees, Proceedings of the 35th International Conference on Machine Learning (ICML), pp.4897-4906, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01902068

B. Van Scoy, R. A. Freeman, and K. M. Lynch, The fastest known globally convergent first-order method for minimizing strongly convex functions, IEEE Control Systems Letters, vol.2, issue.1, pp.49-54, 2018.