Adaptive Sampling Under Low Noise Conditions
Nicolò Cesa-Bianchi (Dipartimento di Scienze dell'Informazione, Università degli Studi di Milano)
Conference paper, Jds2009, 2009. HAL: https://hal.inria.fr/inria-00386698
Subject: [MATH.MATH-ST] Mathematics [math] / Statistics [math.ST]

Abstract. We survey recent results on efficient margin-based algorithms for adaptive sampling in binary classification tasks. Using the so-called Mammen-Tsybakov low noise condition to parametrize the distribution of covariates, and assuming linear label noise, we state bounds on the convergence rate of the adaptive sampler to the Bayes risk. These bounds show that, excluding logarithmic factors, the average risk converges to the Bayes risk at rate $N^{-(1+a)(2+a)/(2(3+a))}$, where $N$ denotes the number of queried labels and $a$ is the nonnegative exponent in the low noise condition. For all $a > \sqrt{3}-1$ this convergence rate is asymptotically faster than the rate $N^{-(1+a)/(2+a)}$ achieved by the fully supervised version of the base adaptive sampler, which queries all labels. Moreover, as $a$ grows to infinity (the hard margin condition), the gap between the semi-supervised and fully supervised rates becomes exponential.