N. Anari, S. Oveis Gharan, and A. Rezaei, Monte Carlo Markov chain algorithms for sampling strongly Rayleigh distributions and determinantal point processes, 29th Annual Conference on Learning Theory, vol.49, 2016.

R. Hafiz Affandi, A. Kulesza, E. Fox, and B. Taskar, Nyström approximation for large-scale determinantal processes, Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, vol.31, pp.85-98, 2013.

D. J. Aldous, The random walk construction of uniform spanning trees and uniform labelled trees, SIAM Journal on Discrete Mathematics, vol.3, issue.4, pp.450-465, 1990.

A. El Alaoui and M. W. Mahoney, Fast randomized kernel ridge regression with statistical guarantees, Proceedings of the 28th International Conference on Neural Information Processing Systems, pp.775-783, 2015.

R. Bardenet, F. Lavancier, X. Mary, and A. Vasseur, On a few statistical applications of determinantal point processes, ESAIM: Procs, vol.60, pp.180-202, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01580353

K. Bringmann and K. Panagiotou, Efficient sampling methods for discrete distributions, International Colloquium on Automata, Languages, and Programming, pp.133-144, 2012.

P. Brändén, Unimodality, log-concavity, real-rootedness and beyond, Handbook of Enumerative Combinatorics, vol.10, 2014.

A. Broder, Generating random spanning trees, Foundations of Computer Science, pp.442-447, 1989.

V. Brunel, Learning signed determinantal point processes through the principal minor assignment problem, Advances in Neural Information Processing Systems, vol.31, pp.7365-7374, 2018.

D. Burt, C. E. Rasmussen, and M. van der Wilk, Rates of convergence for sparse variational Gaussian process regression, Proceedings of the 36th International Conference on Machine Learning, vol.97, 2019.

D. Calandriello, L. Carratino, A. Lazaric, M. Valko, and L. Rosasco, Gaussian process optimization with adaptive sketching: Scalable and no regret, Proceedings of the Thirty-Second Conference on Learning Theory, vol.99, 2019.
URL : https://hal.archives-ouvertes.fr/hal-02144311

E. Celis, V. Keswani, D. Straszak, A. Deshpande, T. Kathuria et al., Fair and diverse DPP-based data summarization, Proceedings of the 35th International Conference on Machine Learning, vol.80, 2018.

D. Calandriello, A. Lazaric, and M. Valko, Distributed adaptive sampling for kernel matrix approximation, AISTATS, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01482760

L. Chen, G. Zhang, and E. Zhou, Fast greedy MAP inference for determinantal point process to improve recommendation diversity, Advances in Neural Information Processing Systems, vol.31, pp.5622-5633, 2018.

J. N. Darroch, On the distribution of the number of successes in independent trials, The Annals of Mathematical Statistics, vol.35, issue.3, pp.1317-1321, 1964.

M. Dereziński, Fast determinantal point processes via distortion-free intermediate sampling, Proceedings of the 32nd Conference on Learning Theory, 2019.

P. Diaconis and D. Stroock, Geometric Bounds for Eigenvalues of Markov Chains, The Annals of Applied Probability, 1991.

M. Dereziński and M. K. Warmuth, Unbiased estimates for linear regression via volume sampling, Advances in Neural Information Processing Systems, vol.30, pp.3087-3096, 2017.

M. Dereziński, M. K. Warmuth, and D. Hsu, Leveraged volume sampling for linear regression, Advances in Neural Information Processing Systems, vol.31, pp.2510-2519, 2018.

M. Dereziński, M. K. Warmuth, and D. Hsu, Correcting the bias in least squares regression with volume-rescaled sampling, Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics, 2019.

A. Erraqabi, M. Valko, A. Carpentier, and O. Maillard, Pliable rejection sampling, International Conference on Machine Learning, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01322168

G. Gautier, R. Bardenet, and M. Valko, Zonotope hit-and-run for efficient sampling from projection DPPs, International Conference on Machine Learning, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01526577

G. Gautier, R. Bardenet, and M. Valko, DPPy: Sampling determinantal point processes with Python, Machine Learning Open Source Software (JMLR-MLOSS), 2019.
URL : https://hal.archives-ouvertes.fr/hal-01879424

J. A. Gillenwater, Approximate inference for determinantal point processes, PhD thesis, University of Pennsylvania, 2014.

J. Gillenwater, A. Kulesza, and B. Taskar, Discovering diverse and salient threads in document collections, Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, vol.12, pp.710-720, 2012.

J. A. Gillenwater, A. Kulesza, Z. E. Mariet, and S. Vassilvitskii, Maximizing induced cardinality under a determinantal point process, Advances in Neural Information Processing Systems, vol.31, pp.6911-6920, 2018.

A. Guénoche, Random spanning tree, Journal of Algorithms, vol.4, issue.3, 1983.

W. Hoeffding, On the distribution of the number of successes in independent trials, The Annals of Mathematical Statistics, vol.27, issue.3, pp.713-721, 1956.

J. B. Hough, M. Krishnapur, Y. Peres, and B. Virág, Determinantal processes and independence, Probability Surveys, vol.3, pp.206-229, 2006.

B. Kang, Fast determinantal point process sampling with application to clustering, Proceedings of the 26th International Conference on Neural Information Processing Systems, NIPS'13, pp.2319-2327, 2013.

A. Kulesza and B. Taskar, k-DPPs: Fixed-Size Determinantal Point Processes, Proceedings of the 28th International Conference on Machine Learning, pp.1193-1200, 2011.

A. Kulesza and B. Taskar, Determinantal Point Processes for Machine Learning, 2012.

G. Loosli, S. Canu, and L. Bottou, Training invariant support vector machines using selective sampling, Large Scale Kernel Machines, pp.301-320, 2007.

C. Launay, B. Galerne, and A. Desolneux, Exact Sampling of Determinantal Point Processes without Eigendecomposition. arXiv e-prints, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01710266

C. Li, S. Jegelka, and S. Sra, Efficient sampling for k-determinantal point processes, Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, vol.51, 2016.

C. Li, S. Jegelka, and S. Sra, Fast mixing Markov chains for strongly Rayleigh measures, DPPs, and constrained sampling, Proceedings of the 30th International Conference on Neural Information Processing Systems, NIPS'16, pp.4195-4203, 2016.

O. Macchi, The coincidence approach to stochastic point processes, Advances in Applied Probability, vol.7, issue.1, pp.83-122, 1975.

Z. E. Mariet and S. Sra, Kronecker determinantal point processes, Advances in Neural Information Processing Systems, vol.29, pp.2694-2702, 2016.

N. Metropolis and S. Ulam, The Monte Carlo method, Journal of the American Statistical Association, vol.44, issue.247, pp.335-341, 1949.

J. Poulson, High-performance sampling of generic determinantal point processes, arXiv:1905.00165v1, 2019.

R. Pemantle and Y. Peres, Concentration of Lipschitz functionals of determinantal and other strong Rayleigh measures, Combinatorics, Probability and Computing, vol.23, issue.1, pp.140-160, 2014.

J. G. Propp and D. B. Wilson, How to get a perfectly random sample from a generic Markov chain and generate a random spanning tree of a directed graph, Journal of Algorithms, vol.27, issue.2, pp.170-217, 1998.

A. Rudi, D. Calandriello, L. Carratino, and L. Rosasco, On fast leverage score sampling and optimal learning, Advances in Neural Information Processing Systems, vol.31, pp.5672-5682, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01958879

P. Rebeschini and A. Karbasi, Fast mixing for discrete point processes, Conference on Learning Theory, pp.1480-1500, 2015.

C. Zhang, H. Kjellström, and S. Mandt, Determinantal point processes for mini-batch diversification, 33rd Conference on Uncertainty in Artificial Intelligence (UAI), 2017.

C. Zhang, C. Öztireli, S. Mandt, and G. Salvi, Active minibatch sampling using repulsive point processes, AAAI, 2019.