Misspecifying latent class models by mixture binomials
Author(s) - Formann, Anton K.
Publication year - 2001
Publication title - British Journal of Mathematical and Statistical Psychology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.157
H-Index - 51
eISSN - 2044-8317
pISSN - 0007-1102
DOI - 10.1348/000711001159564
Subject(s) - overdispersion , latent class model , mixture model , binomial distribution , negative binomial distribution , poisson distribution , homogeneity (statistics) , econometrics , statistics , mathematics
Four scenarios of homogeneity/heterogeneity with respect to the performance of the subjects and the task difficulties are considered: first, the unconstrained latent class model, providing for heterogeneity in both; second, the mixture binomial, assuming constant task difficulty within each mixing component but different levels of performance across subjects; third, the model of independence, equivalent to the one-class latent class model, allowing for different task difficulties but no variability among subjects; and fourth, the binomial with success probability constant across tasks and subjects. It is shown that both over- and underdispersion may arise in latent class models, of which the other three models are special cases. As a consequence, the latent class model and the mixture binomial may generate nearly indistinguishable score distributions when overdispersion is present, so the score distribution is not always indicative of the lack of fit of the mixture binomial when the latent class model is in fact true. It may therefore be misleading to accept mixture binomials as well-fitting models without additionally assessing the fit of latent class models; this, however, is often done in empirical research. A long series of investigations on Piaget's water-level tasks serves as a case in point.
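To illustrate the abstract's central claim, the sketch below computes the score distribution (number of tasks solved) under an unconstrained latent class model, where each class has its own per-task success probabilities, and under a mixture binomial, where each class has a single constant success probability. All numerical values (class weights, task probabilities) are hypothetical choices for illustration, not data from the paper; within each class the score follows a Poisson-binomial distribution, computed here by iterated convolution.

```python
from math import comb
import numpy as np

def poisson_binomial_pmf(task_probs):
    """PMF of the number of successes over tasks with per-task
    success probabilities (the within-class score distribution
    of a latent class model)."""
    pmf = np.array([1.0])
    for p in task_probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def latent_class_score_dist(weights, class_task_probs):
    """Score distribution of a latent class model: a mixture of
    Poisson-binomial distributions, one per latent class."""
    return sum(w * poisson_binomial_pmf(ps)
               for w, ps in zip(weights, class_task_probs))

def mixture_binomial_score_dist(weights, class_probs, n_tasks):
    """Score distribution of a mixture binomial: each class has one
    constant success probability across all tasks."""
    ks = np.arange(n_tasks + 1)
    return sum(w * np.array([comb(n_tasks, k) * p**k * (1 - p)**(n_tasks - k)
                             for k in ks])
               for w, p in zip(weights, class_probs))

# Hypothetical setup: 2 latent classes, 5 tasks, heterogeneous
# task difficulties within each class.
weights = [0.4, 0.6]
class_task_probs = [[0.20, 0.30, 0.25, 0.35, 0.30],   # low-performing class
                    [0.70, 0.80, 0.75, 0.85, 0.80]]   # high-performing class

lc = latent_class_score_dist(weights, class_task_probs)
# Mixture binomial using each class's mean success probability.
mb = mixture_binomial_score_dist(weights,
                                 [float(np.mean(ps)) for ps in class_task_probs],
                                 n_tasks=5)

print("latent class:", np.round(lc, 4))
print("mixture binomial:", np.round(mb, 4))
print("max abs difference:", np.abs(lc - mb).max())
```

With task difficulties that vary only moderately within each class, the maximum pointwise difference between the two score distributions is small, illustrating why a well-fitting mixture binomial alone does not rule out a true latent class structure.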