Bounds on the Risk for M-SVMs

Yann Guermeur 1, André Elisseeff 2, Dominique Zelus
1 MODBIO - Computational models in molecular biology, INRIA Lorraine, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
Abstract: Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied in its own right. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers for multi-class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, these theorems are applied to the architecture shared by all the multi-class SVMs (M-SVMs) proposed so far, which provides us with a simple theoretical framework in which to study them, compare their performance, and design new machines.
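
For orientation, the LaTeX sketch below illustrates the generic shape of such results: the architecture shared by the M-SVMs (a winner-takes-all rule over Q affine functions in feature space) and a schematic Bartlett-type margin bound in which the confidence term is controlled by a covering number. The symbols (Q, gamma, m, delta, N_infty) and the constants are illustrative assumptions only, not the precise theorem proved in the paper.

% Schematic sketch; the exact statement and constants are those of the paper.
% Shared M-SVM architecture: Q functions h_k, winner-takes-all decision.
\[
  h_k(x) = \langle w_k, \Phi(x) \rangle + b_k, \qquad
  f(x) = \operatorname*{arg\,max}_{1 \le k \le Q} h_k(x).
\]
% Generic Bartlett-type margin bound (assumed constants): with probability
% at least 1 - \delta over an i.i.d. sample of size m,
\[
  R(f) \;\le\; R_{\gamma}^{\mathrm{emp}}(f)
  \;+\; \sqrt{\frac{2}{m}\left(
    \ln \mathcal{N}_{\infty}\!\left(\tfrac{\gamma}{2}, \mathcal{F}, 2m\right)
    + \ln \frac{2}{\delta} \right)},
\]
% where R is the risk, R_\gamma^emp the empirical margin risk at scale \gamma,
% and \mathcal{N}_\infty a uniform covering number of the class \mathcal{F},
% itself bounded from above via a generalized VC dimension.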
Document type: Journal articles

https://hal.inria.fr/inria-00099587
Contributor: Publications Loria
Submitted on: Tuesday, September 26, 2006 - 9:39:00 AM
Last modification on: Friday, February 26, 2021 - 3:28:05 PM

Identifiers

  • HAL Id: inria-00099587, version 1

Citation

Yann Guermeur, André Elisseeff, Dominique Zelus. Bounds on the Risk for M-SVMs. Applied Stochastic Models in Business and Industry, Wiley, 2003. ⟨inria-00099587⟩
