Open Access
A Faster Gradient Ascent Learning Algorithm for Nonlinear SVM
Author(s) -
Cătălina Cocianu,
Luminiţa State,
Marinela Mircea,
Panagiotis Vlamos
Publication year - 2013
Publication title -
ISRN Applied Mathematics
Language(s) - English
Resource type - Journals
eISSN - 2090-5572
pISSN - 2090-5564
DOI - 10.1155/2013/520635
Subject(s) - algorithm, support vector machine, convergence, rate of convergence, generalization, heuristic, nonlinear system, gradient descent, feature vector, pattern recognition, machine learning, artificial neural network, artificial intelligence, computer science, mathematics
We propose a refined gradient ascent method with heuristic parameters for solving the dual problem of the nonlinear SVM. To obtain better tuning to the particular training sequence, the proposed refinement uses heuristically established weights to correct the search direction at each step of the learning algorithm, which evolves in the feature space. We propose three variants for computing the correcting weights; their effectiveness is analyzed experimentally in the final part of the paper. The tests pointed out good convergence properties, and moreover, the proposed modified variants exhibited higher convergence rates than Platt’s SMO algorithm. The experimental analysis aimed to derive conclusions on the recognition rate as well as on the generalization capacity. The learning phase of the SVM involved linearly separable samples randomly generated from Gaussian distributions and the WINE and WDBC datasets. The generalization capacity in the case of artificial data was evaluated by several tests performed on new linearly/nonlinearly separable data drawn from the same classes. The tests pointed out high recognition rates (about 97%) on the artificial datasets and even higher recognition rates on the WDBC dataset.
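To make the idea concrete, the sketch below shows plain gradient ascent on a (bias-free) SVM dual objective in which the search direction is corrected at each step by a heuristic weight, here a single momentum-style coefficient `beta`. This is only an illustrative stand-in for the paper's three weighting variants, not the authors' exact algorithm; the RBF kernel, learning rate, and the toy Gaussian sample are assumptions for demonstration.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def dual_gradient_ascent(K, y, C=1.0, lr=0.01, beta=0.5, n_iter=500):
    """Maximize the bias-free SVM dual
        W(a) = sum(a) - 0.5 * a^T (yy^T * K) a,   0 <= a_i <= C,
    by gradient ascent with a corrected search direction
        d_t = g_t + beta * d_{t-1},
    where `beta` is an illustrative heuristic correcting weight."""
    n = len(y)
    Q = (y[:, None] * y[None, :]) * K   # quadratic term of the dual
    a = np.zeros(n)
    d = np.zeros(n)
    for _ in range(n_iter):
        g = 1.0 - Q @ a                 # gradient of W at the current a
        d = g + beta * d                # heuristically corrected direction
        a = np.clip(a + lr * d, 0.0, C) # ascent step, projected onto the box
    return a

# Toy linearly separable sample from two Gaussian classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.r_[np.full(20, -1.0), np.full(20, 1.0)]

K = rbf_kernel(X)
alpha = dual_gradient_ascent(K, y)
pred = np.sign(K @ (alpha * y))         # decision function on the training set
print("training accuracy:", (pred == y).mean())
```

With `beta = 0` this reduces to ordinary projected gradient ascent; the corrected direction accumulates past gradients and typically reaches a usable solution in fewer iterations, which is the effect the paper quantifies for its three weighting schemes.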
