Open Access
New SV selection strategy and local–global regularisation method for improving online SVM learning
Author(s) -
Tang Tinglong,
Chen Shengyong,
Luo Jake
Publication year - 2018
Publication title -
Electronics Letters
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.375
H-Index - 146
ISSN - 1350-911X
DOI - 10.1049/el.2018.0765
Subject(s) - support vector machine , computer science , artificial intelligence , selection (genetic algorithm) , machine learning , process (computing) , sample (material) , quality (philosophy) , pattern recognition (psychology) , data mining , philosophy , chemistry , epistemology , chromatography , operating system
During the online learning process of support vector machines (SVMs), when a newly added sample violates the Karush–Kuhn–Tucker (KKT) conditions, it should become a new support vector (SV), and existing samples may transfer between the SV and non-SV sets. The performance of an SVM model is normally determined by its SVs, and the model should be updated with the newly added SVs; therefore, selecting high-quality candidate SVs leads to better learning accuracy, whereas low-quality candidate SVs may result in low learning efficiency and unnecessary updating. A new strategy is proposed to select candidate SVs according to two new criteria: importance and informativeness. Furthermore, a mixed local–global regularisation method is applied during the online learning process to improve the penalty coefficients. Experimental results show that the proposed algorithm achieves better performance, with faster speed and higher accuracy, than traditional methods.
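
The following is a minimal sketch, not the authors' implementation, of the general idea the abstract describes: screen each incoming sample with a KKT-violation check, and only update the model when the candidate SV passes quality thresholds. The `importance` and `informativeness` scoring functions, the thresholds, and the synthetic data are hypothetical placeholders for the paper's criteria; the mixed local–global regularisation of the penalty coefficients is not shown.

```python
# Sketch of KKT-based candidate-SV screening for online SVM learning.
# Assumptions: binary labels in {-1, +1}; placeholder scoring functions.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

def kkt_violated(model, x, y, tol=1e-3):
    # A new sample violates the KKT conditions when its functional margin
    # y * f(x) falls below 1, i.e. it would have to become a support vector.
    return y * model.decision_function(x.reshape(1, -1))[0] < 1.0 - tol

def importance(model, x, y):
    # Hypothetical importance score: size of the margin violation.
    return max(0.0, 1.0 - y * model.decision_function(x.reshape(1, -1))[0])

def informativeness(model, x, gamma=1.0):
    # Hypothetical informativeness score: dissimilarity to existing SVs,
    # so samples redundant with current SVs are filtered out.
    sims = rbf_kernel(x.reshape(1, -1), model.support_vectors_, gamma=gamma)
    return 1.0 - sims.max()

# Initial batch and model (toy data for illustration only).
rng = np.random.default_rng(0)
X0 = rng.normal(size=(40, 2))
y0 = np.where(X0[:, 0] + X0[:, 1] > 0, 1, -1)
model = SVC(kernel="rbf", C=1.0, gamma=1.0).fit(X0, y0)

X_keep, y_keep = list(X0), list(y0)
for _ in range(200):
    x_new = rng.normal(size=2)
    y_new = 1 if x_new.sum() > 0 else -1
    if not kkt_violated(model, x_new, y_new):
        continue  # consistent with the current model: no update needed
    if importance(model, x_new, y_new) < 0.1 or informativeness(model, x_new) < 0.05:
        continue  # low-quality candidate SV: skip to avoid unnecessary retraining
    X_keep.append(x_new)
    y_keep.append(y_new)
    model = SVC(kernel="rbf", C=1.0, gamma=1.0).fit(np.array(X_keep), np.array(y_keep))
```

In this sketch, retraining is triggered only for samples that both violate the KKT conditions and score well on the two placeholder criteria, which mirrors the abstract's point that filtering out low-quality candidate SVs avoids unnecessary updates.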
