
Adaptive Sampling Under Low Noise Conditions

Abstract: We survey recent results on efficient margin-based algorithms for adaptive sampling in binary classification tasks. Using the so-called Mammen-Tsybakov low noise condition to parametrize the distribution of covariates, and assuming linear label noise, we state bounds on the convergence rate of the adaptive sampler to the Bayes risk. These bounds show that, excluding logarithmic factors, the average risk converges to the Bayes risk at rate $N^{-(1+a)(2+a)/(2(3+a))}$, where $N$ denotes the number of queried labels and $a$ is the nonnegative exponent in the low noise condition. For all $a > \sqrt{3}-1$, this convergence rate is asymptotically faster than the rate $N^{-(1+a)/(2+a)}$ achieved by the fully supervised version of the base adaptive sampler, which queries all labels. Moreover, for $a$ growing to infinity (hard margin condition), the gap between the semi- and fully-supervised rates becomes exponential.
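As a side note on the threshold above (a minimal check based only on the two rates just quoted, not taken from the paper itself): comparing the two exponents and dividing both sides by $1+a > 0$ (recall $a \ge 0$) gives
$$\frac{(1+a)(2+a)}{2(3+a)} > \frac{1+a}{2+a} \iff (2+a)^2 > 2(3+a) \iff a^2 + 2a - 2 > 0 \iff a > \sqrt{3}-1,$$
so the semi-supervised rate dominates exactly when $a$ exceeds $\sqrt{3}-1$.
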
Document type: Conference papers

Cited literature: 11 references

https://hal.inria.fr/inria-00386698
Contributor: Conférence Jds2009
Submitted on: Friday, May 22, 2009 - 9:13:40 AM
Last modification on: Monday, May 25, 2009 - 7:01:59 AM
Long-term archiving on: Thursday, June 10, 2010 - 11:38:13 PM

File

p128.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: inria-00386698, version 1


Citation

Nicolò Cesa-Bianchi. Adaptive Sampling Under Low Noise Conditions. 41èmes Journées de Statistique, SFdS, 2009, Bordeaux, France. ⟨inria-00386698⟩
