Open Access
Support vector machine classifiers by non-Euclidean margins
Author(s) -
Ying Lin,
Qi Ye
Publication year - 2020
Publication title -
Mathematical Foundations of Computing
Language(s) - English
Resource type - Journals
ISSN - 2577-8838
DOI - 10.3934/mfc.2020018
Subject(s) - hinge loss , support vector machine , artificial intelligence , Euclidean distance , norm (mathematics) , Euclidean geometry , pattern recognition , mathematics , margin (machine learning) , margin classifier , kernel (algebra) , machine learning , computer science , algorithm , discrete mathematics , geometry
In this article, the classical support vector machine (SVM) classifiers are generalized to non-Euclidean margins. We first extend the linear SVM classifiers to non-Euclidean margins, including the theorems and algorithms for the classifiers with both hard and soft margins. In particular, the SVM classifiers with $\infty$-norm margins can be solved by a 1-norm optimization problem that promotes sparsity. Next, we show that the non-linear SVM classifiers with $q$-norm margins can be equivalently transformed into SVMs in $p$-norm reproducing kernel Banach spaces given by the hinge loss, where $1/p + 1/q = 1$. Finally, we present numerical examples on artificial and real data to compare the different algorithms for the SVM classifiers with the $\infty$-norm margin.
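The reduction mentioned in the abstract, from an $\infty$-norm-margin classifier to a sparse 1-norm problem, can be sketched as a linear program: maximizing the $\infty$-norm margin amounts to minimizing $\|w\|_1$ subject to the usual classification constraints, and splitting $w = u - v$ with $u, v \ge 0$ makes the objective linear. The following is a minimal illustration under these assumptions, not the paper's own algorithm; the function name `l1_svm` and the toy data are ours.

```python
import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, C=1.0):
    """Linear soft-margin SVM with a 1-norm objective, solved as an LP.

        min  ||w||_1 + C * sum(xi)
        s.t. y_i (w @ x_i + b) >= 1 - xi_i,  xi_i >= 0

    Writing w = u - v with u, v >= 0 turns ||w||_1 into sum(u) + sum(v),
    so the whole problem is a linear program.
    """
    n, d = X.shape
    # Variable vector: [u (d), v (d), b (1), xi (n)]
    c = np.concatenate([np.ones(2 * d), [0.0], C * np.ones(n)])
    Yx = y[:, None] * X
    # Classification constraints rewritten as A_ub @ z <= b_ub:
    #   -(y_i x_i) @ u + (y_i x_i) @ v - y_i b - xi_i <= -1
    A_ub = np.hstack([-Yx, Yx, -y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(0, None)] * (2 * d) + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    u, v = res.x[:d], res.x[d : 2 * d]
    return u - v, res.x[2 * d]

# Tiny separable example: only the first feature is informative,
# so the 1-norm objective should zero out the second weight.
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = l1_svm(X, y, C=10.0)
```

The sparsity claimed in the abstract shows up here as exact zeros in the recovered weight vector, which a 2-norm (Euclidean-margin) SVM would generally not produce.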
