A comparative study of multi‐class support vector machines in the unifying framework of large margin classifiers
Author(s) - Guermeur Yann, Elisseeff André, Zelus Dominique
Publication year - 2005
Publication title - Applied Stochastic Models in Business and Industry
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.413
H-Index - 40
eISSN - 1526-4025
pISSN - 1524-1904
DOI - 10.1002/asmb.534
Subject(s) - margin (machine learning) , support vector machine , discriminant , class (philosophy) , statistical learning theory , linear discriminant analysis , computer science , dimension (graph theory) , artificial intelligence , bounded function , machine learning , mathematics , vc dimension , pure mathematics , mathematical analysis
Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real‐valued functions). Only in recent years has multi‐class discriminant analysis been studied in its own right. Extending several standard results, including a well‐known theorem by Bartlett, we have derived distribution‐free uniform strong laws of large numbers for multi‐class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, these theorems are applied to the architecture shared by all the multi‐class SVMs proposed so far, which provides a simple theoretical framework in which to study them, compare their performance and design new machines. Copyright © 2005 John Wiley & Sons, Ltd.
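The "architecture shared by all the multi-class SVMs" referred to in the abstract is, in general terms, a set of Q scoring functions (one per class) with prediction by argmax, and a multi-class margin defined as the gap between the true-class score and the best rival score. The sketch below is a minimal illustration of that shared structure, not the authors' implementation; the linear scoring functions, random weights, and helper names are assumptions made purely for demonstration.

```python
import numpy as np

# Minimal sketch (assumed, not from the paper) of the architecture common to
# multi-class SVMs: Q affine component functions h_k(x) = <w_k, x> + b_k,
# prediction by argmax, and the multi-class margin that large-margin bounds
# of the kind discussed in the abstract control.

rng = np.random.default_rng(0)
Q, d = 3, 5                       # number of classes, input dimension (hypothetical)
W = rng.normal(size=(Q, d))       # one weight vector per class (stands in for trained parameters)
b = rng.normal(size=Q)            # one bias per class

def scores(x):
    """Component outputs h_1(x), ..., h_Q(x)."""
    return W @ x + b

def predict(x):
    """Assign x to the class whose component output is largest."""
    return int(np.argmax(scores(x)))

def margin(x, y):
    """Multi-class margin: h_y(x) - max_{k != y} h_k(x).
    Positive iff x is classified into y with a strictly positive gap."""
    h = scores(x)
    rival = np.max(np.delete(h, y))
    return float(h[y] - rival)

x = rng.normal(size=d)
print(predict(x), margin(x, predict(x)))
```

Under this view, the various multi-class SVMs (one-versus-rest, Weston-Watkins, Crammer-Singer style formulations) differ mainly in how they penalize violations of this margin during training, which is what makes a single comparative framework possible.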
