Adaptive Sampling Under Low Noise Conditions

Abstract: We survey recent results on efficient margin-based algorithms for adaptive sampling in binary classification tasks. Using the so-called Mammen-Tsybakov low noise condition to parametrize the distribution of covariates, and assuming linear label noise, we state bounds on the convergence rate of the adaptive sampler to the Bayes risk. These bounds show that, excluding logarithmic factors, the average risk converges to the Bayes risk at rate $N^{-(1+a)(2+a)/(2(3+a))}$, where $N$ denotes the number of queried labels and $a$ is the nonnegative exponent in the low noise condition. For all $a > \sqrt{3}-1$ this convergence rate is asymptotically faster than the rate $N^{-(1+a)/(2+a)}$ achieved by the fully supervised version of the base adaptive sampler, which queries all labels. Moreover, as $a$ grows to infinity (hard margin condition), the gap between the semi-supervised and fully supervised rates becomes exponential.
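The comparison of the two rates in the abstract can be checked numerically. The sketch below (function names are illustrative, not from the paper) evaluates both exponents: the two rates coincide exactly at $a = \sqrt{3}-1 \approx 0.732$, the fully supervised rate is faster below that threshold, and the semi-supervised sampler's exponent grows without bound as $a \to \infty$ while the fully supervised exponent approaches 1, which is the exponential gap the abstract mentions.

```python
import math

def semi_supervised_exponent(a):
    """Exponent of the adaptive-sampling rate N^{-(1+a)(2+a)/(2(3+a))}."""
    return (1 + a) * (2 + a) / (2 * (3 + a))

def fully_supervised_exponent(a):
    """Exponent of the fully supervised rate N^{-(1+a)/(2+a)}."""
    return (1 + a) / (2 + a)

# The two exponents cross at a = sqrt(3) - 1 (solve (2+a)^2 = 2(3+a)).
threshold = math.sqrt(3) - 1

# A larger exponent means faster convergence to the Bayes risk.
for a in (0.5, threshold, 2.0, 10.0):
    print(f"a={a:.3f}: semi={semi_supervised_exponent(a):.4f} "
          f"full={fully_supervised_exponent(a):.4f}")
```

At $a = 0.5$ the fully supervised exponent (0.6) beats the semi-supervised one (≈0.536); at $a = 2$ the ordering is reversed (1.2 vs. 0.75), matching the abstract's claim.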
Document type: Conference papers
Contributor: Conférence Jds2009
Submitted on: Friday, May 22, 2009 - 9:13:40 AM
Last modification on: Wednesday, December 8, 2021 - 9:22:03 AM
Long-term archiving on: Thursday, June 10, 2010 - 11:38:13 PM
Files produced by the author(s)
HAL Id: inria-00386698, version 1



Nicolò Cesa-Bianchi. Adaptive Sampling Under Low Noise Conditions. 41èmes Journées de Statistique, SFdS, 2009, Bordeaux, France. ⟨inria-00386698⟩


