Learning Multiple Tasks with Boosted Decision Trees

Abstract: We address the problem of multi-task learning with no label correspondence among tasks. Learning multiple related tasks simultaneously, by exploiting their shared knowledge, can improve the predictive performance on every task. We develop a multi-task Adaboost framework with Multi-Task Decision Trees as weak classifiers. We first adapt well-known decision tree learning to the multi-task setting by revising the information gain rule, and we use this revised rule to develop a novel criterion for learning Multi-Task Decision Trees. The criterion guides tree construction by learning decision rules from the data of different tasks and by representing different degrees of task relatedness. We then modify MT-Adaboost to combine Multi-Task Decision Trees as weak learners. We experimentally validate the advantage of the new technique, reporting results of experiments conducted on several multi-task datasets, including the Enron email set and the Spam Filtering collection.
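The notice does not detail the revised splitting criterion. As a rough, hypothetical sketch only (not the paper's actual MT-DT criterion), the Python snippet below shows one plausible way to score a candidate split shared across tasks: compute the standard information gain on each task's data separately and combine the per-task gains weighted by task sample counts. The names multi_task_information_gain, tasks, feature, and threshold are illustrative assumptions.

import numpy as np

def entropy(labels):
    """Shannon entropy of a label array (in bits)."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def multi_task_information_gain(tasks, feature, threshold):
    """Aggregate information gain of one candidate split across tasks.

    `tasks` is a list of (X, y) pairs, one per task; the same split
    (feature <= threshold) is evaluated on each task's data, and the
    per-task gains are combined, weighted by task sample counts.
    """
    total = sum(len(y) for _, y in tasks)
    gain = 0.0
    for X, y in tasks:
        if len(y) == 0:
            continue
        mask = X[:, feature] <= threshold
        left, right = y[mask], y[~mask]
        task_gain = entropy(y) - (
            len(left) / len(y) * entropy(left)
            + len(right) / len(y) * entropy(right)
        )
        gain += (len(y) / total) * task_gain
    return gain

# Example: two binary-classification tasks sharing the same feature space.
rng = np.random.default_rng(0)
task_a = (rng.normal(size=(50, 3)), rng.integers(0, 2, size=50))
task_b = (rng.normal(size=(40, 3)), rng.integers(0, 2, size=40))
print(multi_task_information_gain([task_a, task_b], feature=0, threshold=0.0))

Under this kind of aggregate score, a split is preferred when it is informative for several tasks at once, which is one way a shared tree node could reflect varying degrees of task relatedness.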
Document type: Conference paper
Lecture Notes in Computer Science. ECML/PKDD 2012 - European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2012, Bristol, United Kingdom

https://hal.inria.fr/hal-00727749
Contributor: Rémi Gilleron
Submitted on: Tuesday, September 4, 2012 - 12:06:15
Last modified on: Thursday, January 11, 2018 - 06:22:13

Identifiers

  • HAL Id: hal-00727749, version 1


Citation

Jean Baptiste Faddoul, Boris Chidlovskii, Rémi Gilleron, Fabien Torre. Learning Multiple Tasks with Boosted Decision Trees. ECML/PKDD 2012 - European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, 2012, Bristol, United Kingdom. Lecture Notes in Computer Science. 〈hal-00727749〉
