Intentionally biased bootstrap methods
Author(s) - Hall, P.; Presnell, B.
Publication year - 1999
Publication title - Journal of the Royal Statistical Society: Series B (Statistical Methodology)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 6.523
H-Index - 137
eISSN - 1467-9868
pISSN - 1369-7412
DOI - 10.1111/1467-9868.00168
Subject(s) - estimator, nonparametric statistics, skewness, kurtosis, smoothing, outlier, mathematics, trimming, statistics, null hypothesis, variance, statistical hypothesis testing, econometrics
A class of weighted bootstrap techniques, called biased bootstrap or b‐bootstrap methods, is introduced. It is motivated by the need to adjust empirical methods, such as the ‘uniform’ bootstrap, in a surgical way to alter some of their features while leaving others unchanged. Depending on the nature of the adjustment, the b‐bootstrap can be used to reduce bias, or to reduce variance or to render some characteristic equal to a predetermined quantity. Examples of the last application include a b‐bootstrap approach to hypothesis testing in nonparametric contexts, where the b‐bootstrap enables simulation ‘under the null hypothesis’, even when the hypothesis is false, and a b‐bootstrap competitor to Tibshirani's variance stabilization method. An example of the bias reduction application is adjustment of Nadaraya–Watson kernel estimators to make them competitive with local linear smoothing. Other applications include density estimation under constraints, outlier trimming, sensitivity analysis, skewness or kurtosis reduction and shrinkage.
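The core idea can be illustrated with a small sketch. One simple instance of the b-bootstrap's "render some characteristic equal to a predetermined quantity" application is to replace the uniform resampling weights 1/n with exponentially tilted weights chosen so that the weighted sample mean equals a null value, and then resample with those weights. This is only an illustrative construction under that assumption, not the paper's general distance-minimization formulation; the function name `tilted_weights` and the target `mu0` are hypothetical.

```python
import numpy as np

def tilted_weights(x, mu0, iters=200):
    """Exponential tilting: w_i proportional to exp(t * x_i), with t chosen
    (by bisection) so that the weighted mean sum_i w_i * x_i equals mu0.
    mu0 must lie strictly inside the range of x."""
    x = np.asarray(x, dtype=float)

    def weights(t):
        z = t * x
        w = np.exp(z - z.max())   # subtract max for numerical stability
        return w / w.sum()

    # The weighted mean is monotone increasing in t, so bisection works.
    lo, hi = -50.0 / x.std(), 50.0 / x.std()
    for _ in range(iters):
        t = 0.5 * (lo + hi)
        if weights(t) @ x < mu0:
            lo = t
        else:
            hi = t
    return weights(0.5 * (lo + hi))

rng = np.random.default_rng(0)
x = rng.exponential(size=200) + 1.0          # sample mean near 2

# Tilt the empirical distribution so its mean is 1.8, then resample
# "under the null hypothesis" that the mean equals 1.8.
w = tilted_weights(x, mu0=1.8)
boot_means = rng.choice(x, size=(1000, x.size), p=w).mean(axis=1)
```

Resampling from the tilted distribution yields a bootstrap null distribution of the test statistic even when the observed data do not satisfy the null, which is the hypothesis-testing use mentioned in the abstract.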
