Quasi‐maximum likelihood and the kernel block bootstrap for nonlinear dynamic models
Author(s) - Paulo M. D. C. Parente, Richard J. Smith
Publication year - 2021
Publication title - Journal of Time Series Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.576
H-Index - 54
eISSN - 1467-9892
pISSN - 0143-9782
DOI - 10.1111/jtsa.12573
Subject(s) - mathematics, estimator, kernel (algebra), quasi-likelihood, consistency (knowledge bases), asymptotic distribution, statistics, block (permutation group theory), strong consistency, combinatorics, count data, discrete mathematics, Poisson distribution
This article applies a novel bootstrap method, the kernel block bootstrap (KBB), to quasi-maximum likelihood (QML) estimation of dynamic models with stationary strong mixing data. The method first kernel-weights the components of the quasi-log-likelihood function in an appropriate way and then samples the resultant transformed components using the standard ‘m out of n’ bootstrap. We investigate the first-order asymptotic properties of the KBB method for QML, demonstrating, in particular, its consistency and the first-order asymptotic validity of the bootstrap approximation to the distribution of the QML estimator. A set of simulation experiments for the mean regression model illustrates the efficacy of the kernel block bootstrap for QML estimation.
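The following is a minimal sketch, in Python, of the two-step idea the abstract describes: per-observation quasi-log-likelihood contributions are first kernel-weighted (here with an assumed Bartlett kernel and an arbitrary bandwidth), and the resulting transformed components are then resampled with a standard ‘m out of n’ i.i.d. bootstrap. The Gaussian quasi-likelihood for a mean regression, the kernel choice, the bandwidth, and the helper names (qml_contributions, smoothed_contributions, kbb_qml) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def qml_contributions(beta, y, X):
    """Per-observation Gaussian quasi-log-likelihood contributions (up to constants)."""
    resid = y - X @ beta
    return -0.5 * resid**2

def smoothed_contributions(beta, y, X, bandwidth):
    """Kernel-weighted moving averages of the contributions, indexed by t.

    A Bartlett kernel is assumed; the weights blend each contribution with its
    neighbours so the transformed components retain local dependence.
    """
    contribs = qml_contributions(beta, y, X)
    lags = np.arange(-bandwidth, bandwidth + 1)
    w = 1.0 - np.abs(lags) / (bandwidth + 1)
    w = w / w.sum()
    return np.convolve(contribs, w, mode="same")

def kbb_qml(y, X, bandwidth=8, n_boot=200, m=None):
    """QML estimate plus KBB bootstrap replicates (illustrative sketch)."""
    n, k = X.shape
    m = m or n  # size of each 'm out of n' draw

    # QML estimate on the original sample: maximize the summed contributions.
    obj = lambda b: -qml_contributions(b, y, X).sum()
    beta_hat = minimize(obj, np.zeros(k)).x

    boot_estimates = np.empty((n_boot, k))
    for b in range(n_boot):
        # Standard 'm out of n' bootstrap applied to the *transformed* components:
        # draw indices i.i.d. with replacement, sum the smoothed contributions there.
        idx = rng.integers(0, n, size=m)
        boot_obj = lambda beta: -smoothed_contributions(beta, y, X, bandwidth)[idx].sum()
        boot_estimates[b] = minimize(boot_obj, beta_hat).x
    return beta_hat, boot_estimates

# Example: mean regression with autocorrelated (AR(1)) errors, illustrative only.
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.normal()
y = X @ np.array([1.0, 2.0]) + u

beta_hat, boot = kbb_qml(y, X)
print("QML estimate:", beta_hat)
print("KBB bootstrap std. errors:", boot.std(axis=0))
```

The spread of the bootstrap replicates around the QML estimate gives the KBB approximation to the estimator's sampling distribution; any further details (kernel, bandwidth rule, choice of m) would need to follow the article itself.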