Open Access
Error Bounds for lp‐Norm Multiple Kernel Learning with Least Square Loss
Author(s) - Shaogao Lv, Jin-De Zhu
Publication year - 2012
Publication title - Abstract and Applied Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.228
H-Index - 56
eISSN - 1687-0409
pISSN - 1085-3375
DOI - 10.1155/2012/915920
Subject(s) - mathematics , multiple kernel learning , kernel (algebra) , kernel method , radial basis function kernel , mathematical optimization , artificial intelligence , computer science , support vector machine
The problem of learning the kernel function as a linear combination of multiple kernels has attracted considerable attention recently in machine learning. Specifically, by imposing an lp-norm penalty on the kernel combination coefficients, multiple kernel learning (MKL) has proved useful and effective both in theoretical analysis and in practical applications (Kloft et al., 2009, 2011). In this paper, we present a theoretical analysis of the approximation error and learning ability of lp-norm MKL. Our analysis yields explicit learning rates for lp-norm MKL and demonstrates some notable advantages over traditional kernel-based learning algorithms in which the kernel is fixed.
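To make the setting concrete, the following is a minimal sketch of lp-norm MKL with least-square loss, alternating between kernel ridge regression for fixed kernel weights and the closed-form lp-norm weight update described by Kloft et al. All function names, parameter values, and the toy data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lp_mkl_ridge(kernels, y, p=1.5, lam=1e-2, n_iter=20):
    """Illustrative lp-norm MKL with least-square loss.

    Alternates (1) kernel ridge regression with the combined kernel
    K = sum_m theta_m * K_m and (2) the closed-form lp-norm update
    theta_m ∝ ||w_m||^{2/(p+1)}, projected back onto ||theta||_p = 1.
    """
    M, n = len(kernels), len(y)
    theta = np.full(M, M ** (-1.0 / p))  # uniform start with ||theta||_p = 1
    for _ in range(n_iter):
        # Step 1: ridge regression with the weighted kernel combination.
        K = sum(t * Km for t, Km in zip(theta, kernels))
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # Step 2: block norms ||w_m||^2 = theta_m^2 * alpha' K_m alpha,
        # then the closed-form lp-norm coefficient update.
        sq_norms = np.array([
            max(theta[m] ** 2 * alpha @ kernels[m] @ alpha, 1e-12)
            for m in range(M)
        ])
        theta = sq_norms ** (1.0 / (p + 1))
        theta /= np.sum(theta ** p) ** (1.0 / p)  # renormalize: ||theta||_p = 1
    return theta, alpha

# Toy example: two Gaussian kernels of different bandwidths.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
kernels = [np.exp(-sq / (2 * s ** 2)) for s in (0.5, 2.0)]
theta, alpha = lp_mkl_ridge(kernels, y, p=1.5)
```

Smaller values of p drive the learned weights toward sparsity (p = 1 recovers sparse MKL), while larger p spreads weight more uniformly across the candidate kernels.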
