Bayesian decision rules to classification problems
Author(s) - Long Yuqi, Xu Xingzhong
Publication year - 2021
Publication title - Australian and New Zealand Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 1369-1473
DOI - 10.1111/anzs.12325
Subject(s) - bayes' theorem , mathematics , bayes' rule , decision rule , naive bayes classifier , bayesian probability , classification rule , artificial intelligence , bayes classifier , machine learning , parametric statistics , bayes estimator , sample size determination , bayes factor , pattern recognition (psychology) , statistics , computer science , support vector machine
Summary In this paper, we analysed classification rules under Bayesian decision theory. The setup considered here is fairly general and can represent all parametric models. The Bayes classification rule we investigated minimises the Bayes risk under general loss functions. In the existing literature, the 0-1 loss function appears most frequently; under this loss, the Bayes classification rule is determined by the posterior predictive densities. Theoretically, we extended the Bernstein–von Mises theorem to the multiple-sample case. On this basis, we discussed in detail the oracle property of the Bayes classification rule, namely its convergence, as the sample size tends to infinity, to the rule built from the true distributions. Simulations show that the Bayes classification rules have advantages over traditional classifiers, especially when the number of features approaches the sample size.
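To illustrate the 0-1 loss case mentioned in the summary, the following is a minimal sketch (not the paper's own method or experiments) of a Bayes classification rule driven by posterior predictive densities. It assumes a toy setting of the sketch's own choosing: univariate Gaussian class-conditional models with known variance and conjugate Normal priors on the class means, so the posterior predictive is available in closed form; all function names and numbers are illustrative.

```python
import numpy as np
from scipy.stats import norm

# Illustrative assumptions (not from the paper): each class k has data
# x ~ N(mu_k, sigma2) with sigma2 known, and a conjugate prior
# mu_k ~ N(mu0, tau2). The posterior predictive for a new x is then
# N(mu_n, sigma2 + tau_n^2) with the standard conjugate updates.

def posterior_predictive(x, data, sigma2=1.0, mu0=0.0, tau2=10.0):
    """Closed-form posterior predictive density at x given one class's data."""
    n = len(data)
    tau_n2 = 1.0 / (1.0 / tau2 + n / sigma2)          # posterior variance of mu
    mu_n = tau_n2 * (mu0 / tau2 + data.sum() / sigma2)  # posterior mean of mu
    return norm.pdf(x, loc=mu_n, scale=np.sqrt(sigma2 + tau_n2))

def bayes_classify(x, samples_by_class, priors):
    """Under 0-1 loss, assign x to the class maximising
    (class prior) * (posterior predictive density at x)."""
    scores = [p * posterior_predictive(x, d)
              for p, d in zip(priors, samples_by_class)]
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
class0 = rng.normal(-2.0, 1.0, size=50)  # synthetic training data, class 0
class1 = rng.normal(2.0, 1.0, size=50)   # synthetic training data, class 1

print(bayes_classify(-1.5, [class0, class1], [0.5, 0.5]))  # near class 0's mean
print(bayes_classify(1.8, [class0, class1], [0.5, 0.5]))   # near class 1's mean
```

Under 0-1 loss this prior-weighted posterior-predictive comparison is exactly the rule that minimises posterior misclassification probability; the paper's setting is more general, covering arbitrary parametric models and loss functions.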
