F. Bach, On the Equivalence between Kernel Quadrature Rules and Random Feature Expansions, Journal of Machine Learning Research, pp.1-38, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01118276

R. Baraniuk, M. Davenport, R. DeVore, and M. Wakin, A simple proof of the restricted isometry property for random matrices, Constructive Approximation, pp.253-263, 2008.

A. Beck and Y. C. Eldar, Sparsity Constrained Nonlinear Optimization: Optimality Conditions and Algorithms, SIAM Journal on Optimization, vol.23, issue.3, pp.1480-1509, 2013.
DOI : 10.1137/120869778

URL : http://arxiv.org/pdf/1203.4580

T. Blumensath, Compressed sensing with nonlinear observations and related nonlinear optimization problems, IEEE Transactions on Information Theory, pp.1-19, 2013.
DOI : 10.1109/tit.2013.2245716

URL : https://eprints.soton.ac.uk/164753/1/Submission_final.pdf

T. Blumensath and M. E. Davies, Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces, IEEE Transactions on Information Theory, vol.55, issue.4, pp.1872-1882, 2009.
DOI : 10.1109/TIT.2009.2013003

P. T. Boufounos and R. G. Baraniuk, 1-Bit Compressive Sensing, Conference on Information Sciences and Systems (CISS), pp.16-21, 2008.

P. T. Boufounos, S. Rane, and H. Mansour, Representation and Coding of Signal Geometry, Information and Inference: a Journal of the IMA, vol.6, issue.4, pp.349-388, 2017.

A. Bourrier, M. E. Davies, T. Peleg, and R. Gribonval, Fundamental Performance Limits for Ideal Decoders in High-Dimensional Linear Inverse Problems, IEEE Transactions on Information Theory, vol.60, issue.12, pp.7928-7946, 2014.
DOI : 10.1109/TIT.2014.2364403

URL : https://hal.archives-ouvertes.fr/hal-00908358

E. J. Candès, The restricted isometry property and its implications for compressed sensing, Comptes Rendus Mathematique, vol.346, issue.9-10, pp.1-4, 2008.

E. J. Candès, J. K. Romberg, and T. Tao, Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, vol.52, issue.2, pp.489-509, 2006.
DOI : 10.1109/TIT.2005.862083

E. J. Candès and T. Tao, Decoding by linear programming, IEEE Transactions on Information Theory, vol.51, issue.12, pp.4203-4215, 2005.

A. Cohen, W. Dahmen, and R. DeVore, Compressed sensing and best k-term approximation, Journal of the American Mathematical Society, vol.22, issue.1, pp.211-231, 2009.
DOI : 10.1090/S0894-0347-08-00610-3

URL : http://www.igpm.rwth-aachen.de/Download/reports/pdf/IGPM260.pdf

D. L. Donoho, Compressed sensing, IEEE Transactions on Information Theory, vol.52, issue.4, pp.1289-1306, 2006.

H. W. Engl and P. Kügler, Nonlinear inverse problems: theoretical aspects and some industrial applications, in Multidisciplinary Methods for Analysis Optimization and Control of Complex Systems, pp.3-47, 2005.

S. Foucart and H. Rauhut, A Mathematical Introduction to Compressive Sensing, Applied and Numerical Harmonic Analysis, Birkhäuser, 2013.

R. Giryes, G. Sapiro, and A. M. Bronstein, Deep Neural Networks with Random Gaussian Weights: A Universal Classification Strategy?, IEEE Transactions on Signal Processing, vol.64, issue.13, pp.3444-3457, 2016.
DOI : 10.1109/TSP.2016.2546221

URL : http://arxiv.org/pdf/1504.08291

R. Gribonval, G. Blanchard, N. Keriven, and Y. Traonmilin, Compressive Statistical Learning with Random Feature Moments, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01544609

L. Jacques, Small width, low distortions: quasi-isometric embeddings with quantized sub-Gaussian random projections, pp.1-26, 2015.
DOI : 10.1109/tit.2017.2717583

URL : http://arxiv.org/pdf/1504.06170

N. Keriven, A. Bourrier, R. Gribonval, and P. Pérez, Sketching for large-scale learning of mixture models, 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp.1-62, 2017.
DOI : 10.1109/ICASSP.2016.7472867

URL : https://hal.archives-ouvertes.fr/hal-01329195

T. Peleg, R. Gribonval, and M. E. Davies, Compressed Sensing and Best Approximation from Unions of Subspaces: Beyond Dictionaries, European Signal Processing Conference (EUSIPCO), 2013.
URL : https://hal.archives-ouvertes.fr/hal-00812858

A. Rahimi and B. Recht, Random Features for Large-Scale Kernel Machines, Advances in Neural Information Processing Systems (NIPS), 2007.

A. Rahimi and B. Recht, Weighted sums of random kitchen sinks: Replacing minimization with randomization in learning, Advances in Neural Information Processing Systems (NIPS), 2009.

A. Rudi and L. Rosasco, Generalization Properties of Learning with Random Features, Advances in Neural Information Processing Systems (NIPS), 2017.

Y. Traonmilin and R. Gribonval, Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all, Applied and Computational Harmonic Analysis, 2016.
DOI : 10.1016/j.acha.2016.08.004

URL : https://hal.archives-ouvertes.fr/hal-01207987