Generalised quasi‐likelihood inference in a semi‐parametric binary dynamic mixed logit model
Author(s) - Nan Zheng, Brajendra C. Sutradhar
Publication year - 2018
Publication title - Australian and New Zealand Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 41
eISSN - 1467-842X
pISSN - 1369-1473
DOI - 10.1111/anzs.12235
Subject(s) - mathematics, statistics, econometrics, estimation theory, estimator, parametric statistics, parametric model, semiparametric model, quasi likelihood, restricted maximum likelihood, random effects model, binary data, count data, poisson distribution, inference, consistency
A recent study extended dynamic mixed-effects regression models for count data to a semi-parametric context. However, results based on count data models are not directly applicable to other discrete data such as binary responses. In this paper, we therefore begin with existing binary dynamic mixed models and generalise them to the semi-parametric context. For inference, we use a new semi-parametric conditional quasi-likelihood (SCQL) approach to estimate the non-parametric function involved in the semi-parametric model, and a semi-parametric generalised quasi-likelihood (SGQL) approach to estimate the main regression, dynamic dependence and random-effects variance parameters. A semi-parametric maximum likelihood (SML) approach is also used for comparison with the SGQL approach. The properties of the estimators are examined both asymptotically and empirically: consistency of the estimators is established, and their finite-sample performance is examined through an intensive simulation study.
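To make the modelling setting concrete, the sketch below simulates data from a binary dynamic mixed logit model extended with a non-parametric covariate function, in the spirit of the semi-parametric model described in the abstract. The linear predictor, the smooth function psi, and all parameter values (beta, theta, sigma) are illustrative assumptions rather than the authors' published specification, and the SCQL/SGQL estimation steps themselves are not shown.

```python
# Minimal simulation sketch of a semi-parametric binary dynamic mixed logit
# data-generating process (assumed form, not the paper's exact specification).
import numpy as np

rng = np.random.default_rng(1)

N, T = 200, 10                       # individuals and repeated binary responses
beta = np.array([0.5, -0.3])         # main regression effects (assumed values)
theta = 0.4                          # lag-1 dynamic dependence parameter (assumed)
sigma = 0.8                          # random-effects standard deviation (assumed)

def psi(z):
    # Hypothetical smooth non-parametric component of the linear predictor.
    return np.sin(np.pi * z)

x = rng.normal(size=(N, T, 2))       # covariates entering parametrically
z = rng.uniform(-1, 1, size=(N, T))  # covariate entering non-parametrically
gamma = sigma * rng.standard_normal(N)  # individual-specific random effects

y = np.zeros((N, T), dtype=int)
for t in range(T):
    lag = y[:, t - 1] if t > 0 else np.zeros(N)      # dynamic (lag-1) term
    eta = x[:, t, :] @ beta + psi(z[:, t]) + theta * lag + gamma
    p = 1.0 / (1.0 + np.exp(-eta))                   # conditional logit probability
    y[:, t] = rng.binomial(1, p)

print(y.mean())   # overall proportion of ones in the simulated binary panel
```

In a data-generating process of this kind, the lag-1 response term induces the dynamic dependence and the common random effect gamma_i induces the longitudinal correlation whose parameters, together with beta and the function psi, are the targets of the SCQL and SGQL estimation described above.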
