A Simple Unifying Theory of Multi-Class Support Vector Machines

Yann Guermeur 1
1 MODBIO - Computational models in molecular biology, INRIA Lorraine, LORIA - Laboratoire Lorrain de Recherche en Informatique et ses Applications
Abstract: Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (the computation of dichotomies) and regression (the estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied in its own right. Extending several standard results, among them a well-known theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers for multi-class large margin discriminant models. This technical report deals with the computation of the capacity measures involved in these bounds on the expected risk. Straightforward extensions of results on large margin classifiers highlight the central role played by a new generalized VC dimension, which can be seen either as an extension of the fat-shattering dimension to the multivariate case or as a scale-sensitive version of the graph dimension. The resulting theorems are applied to the architecture shared by all the multi-class SVMs proposed so far, which provides a simple theoretical framework in which to study them, compare their performance and design new machines.
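
As an illustration only (the notation below is assumed for this sketch and is not quoted from the report), the scale-sensitive capacity measures mentioned in the abstract are defined on multi-class margin functions of roughly the following form, for a Q-category classifier computing g = (g_1, ..., g_Q):

\[
  M(g, x, y) \;=\; g_y(x) \;-\; \max_{k \neq y} g_k(x),
\]

where (x, y) is a labelled example. A scale-sensitive dimension at scale \gamma > 0 then measures the largest set of points that the function class can shatter with margin at least \gamma, which is one way the fat-shattering dimension extends from the bi-class case (Q = 2) to Q > 2.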
Document type: Reports

https://hal.inria.fr/inria-00071916
Contributor: Rapport de Recherche Inria
Submitted on: Tuesday, May 23, 2006 - 7:14:23 PM
Last modification on: Friday, February 26, 2021 - 3:28:05 PM
Long-term archiving on: Sunday, April 4, 2010 - 10:44:25 PM

Identifiers

  • HAL Id: inria-00071916, version 1

Citation

Yann Guermeur. A Simple Unifying Theory of Multi-Class Support Vector Machines. [Research Report] RR-4669, INRIA. 2002. ⟨inria-00071916⟩

Metrics

Record views: 151
File downloads: 213