Locally Weighted Learning, Artificial Intelligence Review, vol.11, issue.1, pp.11-73, 1997.
DOI : 10.1007/978-94-017-2053-3_2
An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants, Machine Learning, vol.36, issue.1-2, pp.105-139, 1999.
DOI : 10.1023/A:1007515423169
Bagging predictors, Machine Learning, vol.24, issue.2, pp.123-140, 1996.
DOI : 10.1007/BF00058655
Random Forests, Machine Learning, vol.45, issue.1, pp.5-32, 2001.
DOI : 10.1023/A:1010933404324
Classification and regression trees, 1993.
Statistical Comparisons of Classifiers over Multiple Data Sets, Journal of Machine Learning Research, vol.7, pp.1-30, 2006.
On a Monotonicity Problem in Step-Down Multiple Test Procedures, Journal of the American Statistical Association, vol.88, issue.423, pp.920-923, 1993.
DOI : 10.2307/1266545
Experiments with a new boosting algorithm, Proceedings of the Thirteenth International Conference on Machine Learning, pp.148-156, 1996.
The Use of Ranks to Avoid the Assumption of Normality Implicit in the Analysis of Variance, Journal of the American Statistical Association, vol.32, issue.200, pp.675-701, 1937.
DOI : 10.1080/01621459.1937.10503522
Boosting instance selection algorithms, Knowledge-Based Systems, pp.342-360, 2014.
Prototype Selection for Nearest Neighbor Classification: Taxonomy and Empirical Study, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.34, issue.3, pp.417-435, 2012.
DOI : 10.1109/TPAMI.2011.142
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.459.7736
Induction of One-Level Decision Trees, Proceedings of the Ninth International Workshop on Machine Learning (ML '92), pp.233-240, 1992.
DOI : 10.1016/B978-1-55860-247-2.50035-8
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.23.2878
A Mathematically Rigorous Foundation for Supervised Learning, Multiple Classifier Systems, pp.67-76, 2000.
DOI : 10.1007/3-540-45014-9_6
Local Boosting of Decision Stumps for Regression and Classification Problems, Journal of Computers, vol.1, issue.4, pp.30-37, 2006.
DOI : 10.4304/jcp.1.4.30-37
A two-step rejection procedure for testing multiple hypotheses, Journal of Statistical Planning and Inference, vol.138, issue.6, pp.1521-1527, 2008.
A direct boosting algorithm for the k-nearest neighbor classifier via local warping of the distance metric, Pattern Recognition Letters, vol.33, issue.1, pp.92-102, 2012.
DOI : 10.1016/j.patrec.2011.09.028
A review of instance selection methods, Artificial Intelligence Review, vol.34, issue.2, pp.133-143, 2010.
Scikit-learn: Machine Learning in Python, Journal of Machine Learning Research, vol.12, pp.2825-2830, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00650905
Induction of decision trees, Machine Learning, vol.1, issue.1, pp.81-106, 1986.
DOI : 10.1007/BF00116251
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.167.3624
Ensemble-based classifiers, Artificial Intelligence Review, vol.33, issue.1-2, pp.1-39, 2010.
Noise reduction for instance-based learning with a local maximal margin approach, Journal of Intelligent Information Systems, vol.35, issue.2, pp.301-331, 2010.
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.623.2491
Statistical Learning Theory, Adaptive and Learning Systems for Signal Processing, Communications, and Control, 1998.
Asymptotic Properties of Nearest Neighbor Rules Using Edited Data, IEEE Transactions on Systems, Man, and Cybernetics, vol.2, issue.3, pp.408-421, 1972.
DOI : 10.1109/TSMC.1972.4309137
Data mining, ACM SIGMOD Record, vol.31, issue.1, 2011.
DOI : 10.1145/507338.507355
A local boosting algorithm for solving classification problems, Computational Statistics & Data Analysis, vol.52, issue.4, pp.1928-1941, 2008.
DOI : 10.1016/j.csda.2007.06.015