Feature selection for high‐dimensional regression via sparse LSSVR based on Lp‐norm
Author(s) -
Li ChunNa,
Shao YuanHai,
Zhao Da,
Guo YanRu,
Hua XiangYu
Publication year - 2021
Publication title -
International Journal of Intelligent Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.291
H-Index - 87
eISSN - 1098-111X
pISSN - 0884-8173
DOI - 10.1002/int.22334
Subject(s) - feature selection, support vector machine, regression, pattern recognition, mathematical optimization, algorithm, machine learning, artificial intelligence, computer science, mathematics, statistics
Abstract When solving many regression problems, there exist a large number of input features. However, not all features are relevant to the regression at hand, and including irrelevant features may deteriorate the learning performance. It is therefore essential to select the most relevant features, especially for high‐dimensional regression. Feature selection is an effective way to address this problem: it represents the original data by extracting the relevant features that carry useful information. In this paper, aiming to effectively select useful features in least squares support vector regression (LSSVR), we propose a novel sparse LSSVR based on the Lp‐norm (SLSSVR), 0 < p ≤ 1. Unlike the existing L1‐norm LSSVR (L1‐LSSVR) and Lp‐norm LSSVR (Lp‐LSSVR), SLSSVR uses a smooth approximation of the nonsmooth, nonconvex Lp‐norm term together with an effective solving algorithm. The proposed algorithm avoids the singularity issue that may be encountered in Lp‐LSSVR, and its convergence is guaranteed. Experimental results support the effectiveness of SLSSVR in both feature selection ability and regression performance.
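To make the general idea concrete, the following is a minimal sketch of the technique family the abstract describes: a linear LSSVR primal whose Lp‐norm penalty sum_j |w_j|^p is replaced by the smooth surrogate sum_j (w_j^2 + eps)^(p/2), minimized by iteratively reweighted least squares. The objective form, the choice of surrogate, and all names and parameters (slssvr_fit, C, p, eps, n_iter) are illustrative assumptions, not the authors' exact formulation; see the paper at the DOI above for the actual algorithm and its convergence analysis.

```python
import numpy as np

def slssvr_fit(X, y, C=1.0, p=0.5, eps=1e-6, n_iter=50, tol=1e-6):
    """Sketch of a smoothed Lp-penalized linear LSSVR solved by
    iteratively reweighted least squares (hypothetical, not the
    paper's exact method).

    Assumed objective:
        0.5 * sum_j (w_j^2 + eps)^(p/2)
        + 0.5 * C * sum_i (y_i - w.x_i - b)^2
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # append bias column for b
    theta = np.zeros(d + 1)                # theta = [w; b]
    for _ in range(n_iter):
        w = theta[:d]
        # Reweighting term from the derivative of the smooth surrogate;
        # eps > 0 keeps it finite even when w_j = 0 (no singularity).
        dvec = p * (w ** 2 + eps) ** (p / 2.0 - 1.0)
        D = np.diag(np.append(dvec, 0.0))  # bias is not penalized
        # Each subproblem is a weighted ridge regression in closed form.
        theta_new = np.linalg.solve(D + C * Xb.T @ Xb, C * Xb.T @ y)
        if np.linalg.norm(theta_new - theta) < tol:
            theta = theta_new
            break
        theta = theta_new
    return theta[:d], theta[d]             # weights w, bias b

# Toy usage: only 3 of 50 features matter; small |w_j| flag the rest.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
y = X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2] + 0.01 * rng.standard_normal(100)
w, b = slssvr_fit(X, y, C=10.0, p=0.5)
print(np.flatnonzero(np.abs(w) > 1e-3))    # expected near features 0, 1, 2
```

Because eps > 0 keeps the reweighting term finite even when a coefficient reaches zero, this style of surrogate sidesteps the singularity that the abstract attributes to plain Lp‐LSSVR; features whose fitted |w_j| shrink to (near) zero can then be discarded.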
