A New Support Vector Compression Method Based on Singular Value Decomposition
Author(s) - Yoon SangHun, Lyuh ChunGi, Chun IkJae, Suk JungHee, Roh Tae Moon
Publication year - 2011
Publication title - ETRI Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.295
H-Index - 46
eISSN - 2233-7326
pISSN - 1225-6463
DOI - 10.4218/etrij.11.0210.0349
Subject(s) - support vector machine , singular value decomposition , dimension (graph theory) , compression (physics) , computer science , norm (philosophy) , algorithm , basis (linear algebra) , pattern recognition (psychology) , decomposition , artificial intelligence , mathematics , materials science , geometry , political science , pure mathematics , law , composite material , ecology , biology
In this letter, we propose a new compression method for high-dimensional support vector machines (SVMs). We use singular value decomposition (SVD) to compress the norm part of a radial basis function SVM. By deleting the least significant singular vectors extracted from the decomposition, each support vector can be compressed with minimal energy loss. The compressed vector dimension is selected according to a predefined threshold that limits the energy loss to the design criteria. We verified the proposed vector-compressed SVM (VCSVM) on conventional datasets. Experimental results show that VCSVM reduces computational complexity and memory usage by more than 40% without loss of accuracy when classifying a 20,958-dimension dataset.
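
The sketch below is a rough illustration of the general idea described in the abstract, not the paper's exact algorithm: the support-vector matrix is decomposed with SVD, the rank is chosen as the smallest number of singular vectors whose retained energy exceeds a threshold, and RBF decision values are then evaluated in the reduced subspace. The scikit-learn SVC model, the 0.95 energy threshold, and the rank-selection rule are assumptions made for illustration.

```python
# Illustrative sketch of SVD-based support vector compression.
# The energy threshold and rank-selection rule are assumptions, not the paper's exact design.
import numpy as np
from sklearn.svm import SVC

def compress_support_vectors(sv, energy_threshold=0.95):
    """Project support vectors onto the top-k right singular vectors,
    where k is the smallest rank whose retained energy (cumulative sum
    of squared singular values) exceeds the threshold."""
    U, s, Vt = np.linalg.svd(sv, full_matrices=False)
    energy = np.cumsum(s ** 2) / np.sum(s ** 2)
    k = int(np.searchsorted(energy, energy_threshold)) + 1
    basis = Vt[:k]                    # (k, d) projection basis
    sv_compressed = sv @ basis.T      # (n_sv, k) compressed support vectors
    return sv_compressed, basis, k

def rbf_decision_values(x, sv_compressed, basis, dual_coef, intercept, gamma):
    """Approximate the RBF decision function using compressed vectors:
    squared distances are computed in the k-dimensional subspace, so the
    result approximates the original kernel evaluation."""
    x_c = x @ basis.T                                            # project test points
    d2 = ((x_c[:, None, :] - sv_compressed[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2) @ dual_coef.ravel() + intercept

# Usage example on random data (hypothetical dataset, not the paper's benchmark).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))
y = rng.integers(0, 2, 200)
gamma = 0.01
clf = SVC(kernel="rbf", gamma=gamma).fit(X, y)

sv_c, basis, k = compress_support_vectors(clf.support_vectors_)
scores = rbf_decision_values(X[:5], sv_c, basis, clf.dual_coef_, clf.intercept_, gamma)
print("compressed dimension:", k)
print("approximate decision values:", scores)
```

The compressed dimension k, and hence the memory and per-kernel-evaluation cost, falls directly out of the energy threshold, which is the trade-off the abstract describes.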
