
Optimal classifiers fusion in a non-Bayesian probabilistic framework

Abstract: Combining the outputs of several classifiers is one of the strategies used to improve classification rates in general-purpose classification systems. Some of the most common approaches can be explained using Bayes' formula. In this paper, we tackle the problem of classifier combination using a non-Bayesian probabilistic framework. This approach permits us to derive two linear combination rules that minimize misclassification rates under some constraints on the distribution of classifiers. To show the validity of this approach, we have compared it with other popular combination rules from a theoretical viewpoint using a synthetic data set, and experimentally using two standard databases: the MNIST handwritten digit database and the GREC symbol database. Results on the synthetic data set confirm the theoretical analysis, and results on real data show that the proposed methods outperform other common combination schemes.
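As a rough illustration of the general idea of a linear combination of classifier outputs (a minimal NumPy sketch of weighted posterior averaging, not the specific optimal rules derived in the paper; the weights and toy data below are assumptions for illustration):

```python
import numpy as np

def linear_fusion(posteriors, weights):
    """Combine classifier outputs by a weighted linear sum.

    posteriors: array of shape (n_classifiers, n_samples, n_classes),
                each slice holding one classifier's posterior estimates.
    weights:    array of shape (n_classifiers,), assumed non-negative
                and summing to one.
    Returns the predicted class index for each sample.
    """
    posteriors = np.asarray(posteriors, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # Weighted sum over the classifier axis -> (n_samples, n_classes)
    fused = np.tensordot(weights, posteriors, axes=1)
    return fused.argmax(axis=1)

# Toy example: two classifiers, three samples, two classes.
p1 = np.array([[0.7, 0.3], [0.4, 0.6], [0.9, 0.1]])
p2 = np.array([[0.6, 0.4], [0.2, 0.8], [0.5, 0.5]])
print(linear_fusion([p1, p2], weights=[0.5, 0.5]))  # -> [0 1 0]
```

In the paper, the weights are not fixed by hand as above but chosen to minimize the misclassification rate under constraints on the classifiers' output distributions.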
Document type: Journal articles

https://hal.inria.fr/inria-00434238
Contributor: Salvatore Tabbone
Submitted on: Friday, November 20, 2009, 7:48:48 PM
Last modification on: Tuesday, May 18, 2021, 3:32:05 PM

Identifiers

  • HAL Id: inria-00434238, version 1

Citation

Oriol Ramos-Terrades, Ernest Valveny, Salvatore Tabbone. Optimal classifiers fusion in a non-Bayesian probabilistic framework. IEEE Transactions on Pattern Analysis and Machine Intelligence, Institute of Electrical and Electronics Engineers, 2009, 31 (9), pp.1630-1644. ⟨inria-00434238⟩
