Conference paper, 2010

Complexity Bounds for Batch Active Learning in Classification

Philippe Rolet (1), Olivier Teytaud (1, 2)

Active learning is a branch of Machine Learning in which the learning algorithm, instead of being directly provided with pairs of problem instances and their solutions (their labels), is allowed to choose, from a set of unlabeled data, which instances to query. It is suited to settings where labeling instances is costly. This paper analyzes the speed-up of batch (parallel) active learning compared to sequential active learning (where instances are chosen one by one): how much faster can an algorithm become if it can query a batch of instances at once? There are two main contributions: proving lower and upper bounds on the possible gain, and illustrating them by experimenting on usual active learning algorithms. Roughly speaking, the speed-up is asymptotically logarithmic in the batch size (i.e., as the batch size tends to infinity). However, for some classes of functions with finite VC-dimension V, a linear speed-up can be achieved up to a batch size of V. Practically speaking, this means that parallelizing computations on an expensive-to-label problem which is suited to active learning is very beneficial up to V simultaneous queries, and less interesting (yet still bringing improvement) afterwards.
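The logarithmic speed-up regime can be illustrated on a toy problem. The sketch below (an illustrative example, not code from the paper) actively learns a 1D threshold classifier: the uncertain interval for the hidden threshold shrinks by a factor of lam + 1 per labeling round when lam evenly spaced queries are issued in parallel, so the number of costly labeling rounds scales like log(1/eps) / log(lam + 1), i.e. the gain over sequential querying (lam = 1) grows only logarithmically in the batch size. The hidden threshold value is an arbitrary choice for the demo.

```python
def rounds_to_localize(eps, lam, t=0.37):
    """Batch active learning of a 1D threshold classifier on [0, 1].

    Labels are (x >= t) for an unknown threshold t; calling the labeling
    oracle is the expensive step. Each round queries `lam` evenly spaced
    points inside the current uncertain interval (lam parallel label
    requests), then keeps the sub-interval consistent with the answers.
    Returns the number of labeling rounds until the interval is <= eps.
    """
    lo, hi = 0.0, 1.0
    rounds = 0
    while hi - lo > eps:
        width = hi - lo
        # lam queries splitting the uncertain interval into lam+1 gaps.
        queries = [lo + width * (i + 1) / (lam + 1) for i in range(lam)]
        labels = [x >= t for x in queries]  # lam parallel oracle calls
        # The threshold lies in the single gap between the largest
        # negatively-labeled point and the smallest positively-labeled one.
        for x, y in zip(queries, labels):
            if y:
                hi = min(hi, x)
            else:
                lo = max(lo, x)
        rounds += 1
    return rounds

# Sequential (lam=1) halves the interval each round; a batch of 7 shrinks
# it by a factor of 8, so rounds drop roughly by log2(8) = 3, not by 7.
print(rounds_to_localize(1e-6, 1))  # 20 rounds
print(rounds_to_localize(1e-6, 7))  # 7 rounds
```

Note the contrast with the linear-speed-up regime: here the hypothesis class (thresholds on a line) has VC-dimension 1, so per the paper's bounds only the logarithmic gain is available as the batch grows.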
Main file: batchal.pdf (158.41 KB)
Origin: Files produced by the author(s)

Dates and versions

inria-00533318, version 1 (05-11-2010)


Philippe Rolet, Olivier Teytaud. Complexity Bounds for Batch Active Learning in Classification. ECML 2010, Oct 2010, Barcelona, Spain. ⟨inria-00533318⟩