Chance classifications by non‐parametric linear discriminant functions
Author(s) - Lavine B. K., Jurs P. C., Henry D. R.
Publication year - 1988
Publication title - Journal of Chemometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.47
H-Index - 92
eISSN - 1099-128X
pISSN - 0886-9383
DOI - 10.1002/cem.1180020103
Subject(s) - parametric statistics , linear discriminant analysis , covariance , curse of dimensionality , pattern recognition (psychology) , discriminant function analysis , mathematics , artificial intelligence , computer science , monte carlo method , data mining , statistics , machine learning
In applications of pattern recognition techniques to problems in chemical fingerprinting, only limited knowledge about the underlying statistical distribution of the data is generally available. This means that non‐parametric methods must be used. Non‐parametric discriminant functions have been used to provide insight into relationships contained within sets of chemical measurements. However, classification based on random or chance separation can be a serious problem. Monte Carlo simulation studies have been carried out to assess the probability of chance classification for non‐parametric linear discriminants. The level of expected chance classification is a function of the number of observations (the number of samples), the dimensionality of the problem (the number of independent variables per observation), the class membership distribution and the covariance structure of the data being examined. An approach for assessing the level of significance of classification scores obtained from real training sets is presented. These simulation studies establish limits on the approaches that can be taken with real data sets so that chance classifications are improbable, and provide information necessary for integrating the data analysis into the overall experimental design.
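The Monte Carlo idea described in the abstract can be sketched in a few lines: fit a linear discriminant to pure noise with randomly assigned class labels, record the training-set classification rate, and repeat over many trials. The sketch below is illustrative only and uses a simple least-squares linear discriminant as a stand-in for the paper's non-parametric discriminant functions; the function name and parameter choices are assumptions, not the authors' procedure. It demonstrates the abstract's central point: the expected chance-classification rate rises with the number of variables per observation.

```python
import numpy as np

def chance_classification_rate(n_samples, n_features, n_trials=200, seed=0):
    """Monte Carlo estimate of the training-set classification rate a
    linear discriminant achieves on pure noise (no real class structure).
    A least-squares linear discriminant stands in for the non-parametric
    discriminants discussed in the paper (an illustrative assumption)."""
    rng = np.random.default_rng(seed)
    rates = []
    for _ in range(n_trials):
        # Features are random noise, so any separation found is chance.
        X = rng.standard_normal((n_samples, n_features))
        # Balanced two-class labels, unrelated to the features.
        y = np.where(np.arange(n_samples) % 2 == 0, 1.0, -1.0)
        # Augment with a bias column and fit weights by least squares.
        Xa = np.hstack([X, np.ones((n_samples, 1))])
        w, *_ = np.linalg.lstsq(Xa, y, rcond=None)
        # Fraction of training samples the discriminant "classifies" correctly.
        rates.append(np.mean(np.sign(Xa @ w) == y))
    return float(np.mean(rates))

# With a fixed number of observations, raising the dimensionality
# inflates the apparent (chance) classification rate.
low_dim = chance_classification_rate(n_samples=30, n_features=2)
high_dim = chance_classification_rate(n_samples=30, n_features=15)
```

Comparing `low_dim` and `high_dim` against such simulated baselines is one way to judge whether a classification score obtained on a real training set is significant, in the spirit of the study's recommendations.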
