Asymptotic Optimality of Sparse Linear Discriminant Analysis with Arbitrary Number of Classes
Author(s) - Luo Ruiyan, Qi Xin
Publication year - 2017
Publication title - Scandinavian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/sjos.12267
Subject(s) - mathematics , linear discriminant analysis , curse of dimensionality , boundary (topology) , asymptotic analysis , convergence (economics) , infinity , discriminant , mathematical optimization , artificial intelligence , statistics , mathematical analysis , computer science , economics , economic growth
Many sparse linear discriminant analysis (LDA) methods have been proposed to overcome the major problems of the classic LDA in high-dimensional settings. However, existing asymptotic optimality results are limited to the case of only two classes. When there are more than two classes, the classification boundary is complicated and no explicit formulas for the classification errors exist. We consider asymptotic optimality in high-dimensional settings for a large family of linear classification rules with an arbitrary number of classes. Our main theorem provides easy-to-check criteria for the asymptotic optimality of a general classification rule in this family as the dimensionality and sample size both go to infinity and the number of classes is arbitrary. We also establish the corresponding convergence rates. The general theory is applied to the classic LDA and to extensions of two recently proposed sparse LDA methods to establish their asymptotic optimality.
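As background for the family of linear rules discussed in the abstract (this is the standard textbook form of the LDA rule, not the authors' specific notation), the classic LDA classifier with K classes, common covariance matrix $\Sigma$, class means $\mu_k$, and prior probabilities $\pi_k$ assigns an observation $x$ to the class with the largest linear discriminant score:

\[
\hat{k}(x) \;=\; \operatorname*{arg\,max}_{1 \le k \le K} \delta_k(x),
\qquad
\delta_k(x) \;=\; x^{\top}\Sigma^{-1}\mu_k \;-\; \tfrac{1}{2}\,\mu_k^{\top}\Sigma^{-1}\mu_k \;+\; \log \pi_k .
\]

When K > 2, the decision region for each class is an intersection of half-spaces arising from pairwise comparisons of these scores, which is why no single explicit formula for the classification error is available in that case.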