Open Access
One-Class Support Vector Learning and Linear Matrix Inequalities
Author(s) - Jooyoung Park, Jinsung Kim, Hansung Lee, Daihee Park
Publication year - 2003
Publication title - International Journal of Fuzzy Logic and Intelligent Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.296
H-Index - 9
eISSN - 2093-744X
pISSN - 1598-2645
DOI - 10.5391/ijfis.2003.3.1.100
Subject(s) - support vector machine , feature vector , kernel (algebra) , mathematics , matrix (chemical analysis) , class (philosophy) , linear programming , kernel method , ellipsoid , set (abstract data type) , artificial intelligence , feature (linguistics) , pattern recognition (psychology) , computer science , mathematical optimization , combinatorics , linguistics , philosophy , physics , composite material , programming language , materials science , astronomy
The SVDD (support vector data description) is one of the best-known one-class support vector learning methods; it uses balls defined in the kernel feature space to distinguish a set of normal data from all other possible abnormal objects. The main concern of this paper is to modify the SVDD so that it uses ellipsoids instead of balls, in order to achieve better classification performance. After a brief review of the original SVDD method, the paper establishes a new method that uses ellipsoids in the feature space and presents a solution in the form of an SDP (semidefinite programming) problem, an optimization problem based on linear matrix inequalities.
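To illustrate how an ellipsoidal data description leads to a log-det/LMI-style convex program, the sketch below fits a minimum-volume enclosing ellipsoid to sample data. This is not the paper's formulation: the paper works in the kernel feature space and allows slack for outliers, whereas this is a hard-margin, input-space sketch using the cvxpy library (an assumed dependency) purely for illustration.

```python
import numpy as np
import cvxpy as cp

# Hypothetical 2-D "normal" training data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
n, d = X.shape

# Ellipsoid {z : ||A z + b||_2 <= 1}, parameterized by a PSD shape matrix A
# and an offset b.
A = cp.Variable((d, d), PSD=True)
b = cp.Variable(d)

# Require every training point to lie inside the ellipsoid.
constraints = [cp.norm(A @ X[i] + b, 2) <= 1 for i in range(n)]

# Maximizing log det(A) shrinks the ellipsoid's volume; this is a convex
# (semidefinite-representable) problem.
prob = cp.Problem(cp.Maximize(cp.log_det(A)), constraints)
prob.solve()

def is_abnormal(z, A=A.value, b=b.value, tol=1e-6):
    # A test point is flagged abnormal when it falls outside the ellipsoid.
    return np.linalg.norm(A @ z + b) > 1 + tol

print(is_abnormal(np.array([0.1, -0.2])))   # likely inside -> False
print(is_abnormal(np.array([10.0, 10.0])))  # far outside -> True
```

The ball-based SVDD corresponds to restricting A to a scalar multiple of the identity; allowing a general PSD matrix is what gives the ellipsoidal description its extra flexibility.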
