
Boosting products of base classifiers

Balázs Kégl, Róbert Busa-Fekete
TAO - Machine Learning and Optimisation (CNRS UMR 8623, Inria Saclay - Ile de France, Université Paris-Sud 11, LRI - Laboratoire de Recherche en Informatique)
Abstract: In this paper we show how to boost products of simple base learners. As with trees, the base learner is called as a subroutine, but in an iterative rather than a recursive fashion. The main advantage of the proposed method is its simplicity and computational efficiency. On benchmark datasets, our boosted products of decision stumps clearly outperform boosted trees, and on the MNIST dataset the algorithm achieves the second-best result among no-domain-knowledge algorithms, after deep belief nets. As a second contribution, we present an improved base learner for nominal features and show that boosting the product of two of these new subset indicator base learners solves the maximum margin matrix factorization problem used to formalize the collaborative filtering task. On a small benchmark dataset, we obtain experimental results comparable to the semi-definite-programming-based solution, but at a much lower computational cost.
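The core idea in the abstract, boosting a base classifier that is a *product* of simple learners, trained iteratively rather than recursively, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it uses plain binary AdaBoost with ±1-valued decision stumps, whereas the paper works in the multi-class AdaBoost.MH setting with vector-valued base classifiers. The key step is the iterative (coordinate-descent) training of the product: each factor is re-fit against "virtual labels" formed by multiplying the true labels by the output of the other factors. All function names and the toy dataset below are my own.

```python
import numpy as np

def best_stump(X, y, w):
    """Exhaustive search for the decision stump h(x) = s * sign(x_j - theta),
    s in {-1, +1}, minimizing the weighted error sum of w[i] over h(x_i) != y_i."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        vals = np.unique(X[:, j])
        # candidate thresholds: one below the minimum, then midpoints
        thetas = np.concatenate(([vals[0] - 1.0], (vals[:-1] + vals[1:]) / 2))
        for theta in thetas:
            pred = np.where(X[:, j] > theta, 1, -1)
            for s in (1, -1):
                err = w[s * pred != y].sum()
                if err < best_err:
                    best_err, best = err, (s, j, theta)
    return best

def stump_predict(stump, X):
    s, j, theta = stump
    return s * np.where(X[:, j] > theta, 1, -1)

def train_product(X, y, w, m=2, n_sweeps=3):
    """Train a product of m stumps by cyclic coordinate descent: each factor is
    re-fit on 'virtual labels' y * (product of the other factors' outputs)."""
    preds = [np.ones(len(y), dtype=int) for _ in range(m)]
    stumps = [None] * m
    for _ in range(n_sweeps):
        for i in range(m):
            others = np.prod([preds[k] for k in range(m) if k != i], axis=0)
            stumps[i] = best_stump(X, y * others, w)
            preds[i] = stump_predict(stumps[i], X)
    return stumps

def boost_products(X, y, n_rounds=3, m=2):
    """Binary AdaBoost whose base classifier is a product of m stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(n_rounds):
        stumps = train_product(X, y, w, m=m)
        pred = np.prod([stump_predict(s, X) for s in stumps], axis=0)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, stumps))
        w *= np.exp(-alpha * y * pred)  # standard AdaBoost reweighting
        w /= w.sum()
    return ensemble

def predict(ensemble, X):
    score = sum(alpha * np.prod([stump_predict(s, X) for s in stumps], axis=0)
                for alpha, stumps in ensemble)
    return np.where(score >= 0, 1, -1)

# Usage: a 2-D toy set that no single stump separates,
# but that a product of two stumps classifies perfectly.
X = np.array([[1, 1], [2, 1], [-1, -1], [1, -1], [-1, 1], [-1, 2]], dtype=float)
y = np.array([1, 1, 1, -1, -1, -1])
ensemble = boost_products(X, y, n_rounds=3, m=2)
print((predict(ensemble, X) == y).mean())  # -> 1.0 (training accuracy)
```

The product's expressive power is what the paper exploits: here the label depends on the sign agreement of the two coordinates, so any single axis-aligned stump misclassifies some points, while the product of one stump per coordinate fits the set exactly.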
Document type: Conference papers

https://hal.inria.fr/inria-00428905
Contributor: Balázs Kégl
Submitted on: Thursday, October 29, 2009 - 7:00:33 PM
Last modification on: Thursday, July 8, 2021 - 3:49:23 AM

Identifiers

  • HAL Id: inria-00428905, version 1

Citation

Balázs Kégl, Róbert Busa-Fekete. Boosting products of base classifiers. 26th International Conference on Machine Learning (ICML 2009), Jun 2009, Montreal, Canada. pp.497-504. ⟨inria-00428905⟩
