Open Access
The Support Vector Regression with Adaptive Norms
Author(s) -
Chunhua Zhang,
Dewei Li,
Junyan Tan
Publication year - 2013
Publication title -
procedia computer science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.334
H-Index - 76
ISSN - 1877-0509
DOI - 10.1016/j.procs.2013.05.341
Subject(s) - computer science , support vector machine , regression analysis , mathematical optimization , machine learning , artificial intelligence , algorithm , mathematics , statistics
This study proposes a new method for regression: lp-norm support vector regression (lp-SVR). Classical SVRs minimize the hinge loss function subject to an l2-norm or l1-norm penalty. These methods are non-adaptive, since the form of the penalty is fixed in advance regardless of the data. The new model is an adaptive learning procedure with an lp-norm penalty (0 < p < 1), where the best p is chosen automatically from the data. By adjusting the parameter p, lp-SVR can not only select relevant features but also improve regression accuracy. An iterative algorithm is proposed to solve lp-SVR efficiently. Simulations and applications to real data support the effectiveness of the proposed procedure.
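To illustrate the core idea, here is a minimal sketch of lp-norm (0 < p < 1) penalized regression solved by an iteratively reweighted scheme, a standard way to handle such non-convex penalties. This is an illustration only, not the paper's algorithm: it uses squared loss in place of the SVR loss, and the function name, parameters, and data are invented for the example.

```python
import numpy as np

def lp_penalized_regression(X, y, p=0.5, lam=1.0, iters=50, eps=1e-6):
    """Approximately solve  min_w ||Xw - y||^2 + lam * sum_j |w_j|^p
    for 0 < p < 1 by iteratively reweighted ridge regression.
    Each iteration replaces |w_j|^p with a quadratic surrogate built
    from the previous iterate, giving a closed-form linear solve."""
    w = np.linalg.lstsq(X, y, rcond=None)[0]  # unpenalized start
    for _ in range(iters):
        # Surrogate weight for |w_j|^p: (p/2) * |w_j|^(p-2) * w_j^2;
        # eps guards against division by zero as coefficients vanish.
        diag = lam * (p / 2.0) * (np.abs(w) + eps) ** (p - 2)
        w = np.linalg.solve(X.T @ X + np.diag(diag), X.T @ y)
    return w

# Synthetic sparse problem: only 2 of 10 features are relevant,
# so the lp penalty should drive the other coefficients toward zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
w_true = np.zeros(10)
w_true[[0, 3]] = [2.0, -1.5]
y = X @ w_true + 0.05 * rng.normal(size=200)
w_hat = lp_penalized_regression(X, y, p=0.5, lam=1.0)
```

In the paper's adaptive setting the exponent p itself is tuned from the data (e.g. by validation over a grid of p values); the sketch above fixes p to show the effect of a single lp penalty: large coefficients are barely shrunk while small ones are crushed toward zero, which is what enables feature selection.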
