Another look at distance‐weighted discrimination
Author(s) -
Boxiang Wang,
Hui Zou
Publication year - 2018
Publication title -
Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/rssb.12244
Subject(s) - support vector machine, kernel (algebra), reproducing kernel Hilbert space, computer science, kernel method, pattern recognition (psychology), algorithm, computation, artificial intelligence, mathematics, machine learning, Hilbert space, mathematical analysis, combinatorics
Summary - Distance-weighted discrimination (DWD) is a modern margin-based classifier with an interesting geometric motivation that was proposed as a competitor to the support vector machine (SVM). Despite many recent references on DWD, it remains far less popular than the SVM, mainly for computational and theoretical reasons. We greatly advance the current DWD methodology and its learning theory. We propose a novel thrifty algorithm for solving standard DWD and generalized DWD, and our algorithm can be several hundred times faster than the existing state-of-the-art algorithm based on second-order cone programming. In addition, we exploit the new algorithm to design an efficient scheme for tuning generalized DWD. Furthermore, we formulate a natural kernel DWD in a reproducing kernel Hilbert space and establish the Bayes risk consistency of kernel DWD with a universal kernel such as the Gaussian kernel, solving an open theoretical problem in the DWD literature. A comparison study on 16 benchmark data sets shows that data-driven generalized DWD consistently delivers higher classification accuracy with less computation time than the SVM.
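To make the margin-based setup concrete, the sketch below writes out the generalized DWD loss as it commonly appears in the DWD literature, alongside the SVM hinge loss for comparison. This is an illustrative helper, not code from the paper; the function names and the parameterization by the exponent q (q = 1 recovering standard DWD) are assumptions of this sketch.

```python
import numpy as np

def dwd_loss(u, q=1.0):
    """Generalized DWD loss V_q(u), as commonly written in the
    DWD literature (hypothetical helper, not the paper's code).

    V_q(u) = 1 - u                              if u <= q/(q+1)
           = (q^q / (q+1)^(q+1)) * u^(-q)       otherwise

    With q = 1 this reduces to the standard DWD loss:
    1 - u for u <= 1/2, and 1/(4u) otherwise.
    """
    u = np.asarray(u, dtype=float)
    thresh = q / (q + 1.0)
    const = q**q / (q + 1.0) ** (q + 1.0)
    return np.where(u <= thresh, 1.0 - u, const / u**q)

def hinge_loss(u):
    """SVM hinge loss max(0, 1 - u), for side-by-side comparison."""
    return np.maximum(0.0, 1.0 - np.asarray(u, dtype=float))
```

Unlike the hinge loss, which is exactly zero for margins u >= 1, the DWD loss decays smoothly and never vanishes, so every observation keeps some influence on the fitted classifier; this is the key structural difference the comparison in the summary refers to.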