Open Access
Robust and Sparse Lp-Norm Support Vector Regression
Author(s) -
Ya-Fen Ye,
Chao Ying,
YuanHai Shao,
ChunNa Li,
Yujuan Chen
Publication year - 2017
Publication title -
Journal of Advanced Computational Intelligence and Intelligent Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 20
eISSN - 1343-0130
pISSN - 1883-8014
DOI - 10.20965/jaciii.2017.p0989
Subject(s) - outlier, robustness (evolution), feature selection, computer science, support vector machine, norm (philosophy), robust regression, regression, pattern recognition (psychology), artificial intelligence, nonlinear system, nonlinear regression, regression analysis, algorithm, mathematics, machine learning, statistics
A robust and sparse Lp-norm support vector regression (Lp-RSVR) is proposed in this paper. Implementing feature selection in our Lp-RSVR not only preserves regression performance but also improves robustness. The main characteristics of Lp-RSVR are as follows: (i) by using an absolute-value constraint, Lp-RSVR is robust against outliers; (ii) theoretical analysis guarantees that Lp-RSVR selects useful features; (iii) based on the feature-selection results, a nonlinear Lp-RSVR can be applied when the data are structurally nonlinear. Experimental results demonstrate the superiority of the proposed Lp-RSVR in feature selection, regression performance, and robustness.
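The paper itself gives the exact formulation and solver; purely as a rough illustration of the two ingredients named in the abstract — an absolute-value loss for robustness against outliers and an Lp penalty (p < 1) for sparsity — the following is a minimal NumPy sketch. It is not the authors' algorithm: the function name, the smoothing constants, and the plain gradient-descent solver are all assumptions made here for readability.

```python
import numpy as np

def lp_rsvr_fit(X, y, p=0.5, lam=0.02, lr=0.05, iters=5000,
                eps_loss=1e-6, eps_pen=1e-3):
    """Hypothetical sketch: minimize  mean|y - Xw - b| + lam * sum_j |w_j|^p
    by gradient descent on a smoothed surrogate (NOT the paper's solver).

    - The absolute loss caps each sample's gradient at magnitude 1,
      so outliers cannot dominate the fit (robustness).
    - The Lp penalty with p < 1 drives irrelevant weights toward zero
      (sparsity, i.e. feature selection).
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(iters):
        r = X @ w + b - y
        g = r / np.sqrt(r**2 + eps_loss)  # smoothed sign(r), |g| <= 1
        grad_w = (X.T @ g / n
                  + lam * p * np.sign(w) * (np.abs(w) + eps_pen) ** (p - 1))
        grad_b = g.mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

In this sketch, the "selected" features would be the indices whose fitted |w_j| exceeds a small threshold; the nonlinear variant described in the abstract would then be trained on those features only.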
