Simple Transformation Techniques for Improved Non‐parametric Regression
Author(s) -
Park B. U.,
Kim W. C.,
Ruppert D.,
Jones M. C.,
Signorini D. F.,
Kohn R.
Publication year - 1997
Publication title -
scandinavian journal of statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.359
H-Index - 65
eISSN - 1467-9469
pISSN - 0303-6898
DOI - 10.1111/1467-9469.00055
Subject(s) - mathematics , statistics , nonparametric regression , kernel regression , local regression , estimator , smoothing , transformation
We propose and investigate two new methods for achieving less bias in non-parametric regression. We show that the new methods have bias of order h^4, where h is a smoothing parameter, in contrast to the basic kernel estimator's order h^2. The methods are conceptually very simple. At the first stage, perform an ordinary non-parametric regression on {(x_i, Y_i)} to obtain m̂(x_i) (we use local linear fitting). In the first method, at the second stage, repeat the non-parametric regression, but on the transformed data set {(m̂(x_i), Y_i)}, taking the estimator at x to be this second-stage estimator at m̂(x). In the second, and more appealing, method, again perform non-parametric regression on {(m̂(x_i), Y_i)}, but this time make the kernel weights depend on the original x scale rather than on the m̂(x) scale. We concentrate more of our effort in this paper on the latter because of its advantages over the former. Our emphasis is largely theoretical, but we also show that the latter method has practical potential through simulated examples.
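The two-stage scheme described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes a Gaussian kernel and plain weighted least squares for the local linear fits, and the function names (`local_linear`, `two_stage`) and bandwidth parameters (`h1`, `h2`) are hypothetical. It implements the second, more appealing method: stage one fits m̂ by ordinary local linear regression; stage two regresses Y on the transformed covariate m̂(x_i), while the kernel weights are still computed from distances on the original x scale.

```python
import numpy as np

def _gauss(u):
    # Gaussian kernel (any symmetric kernel would do for this sketch)
    return np.exp(-0.5 * u ** 2)

def local_linear(x0, x, y, h):
    """Ordinary local linear regression estimate at each point of x0."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    out = np.empty(x0.shape)
    for j, t in enumerate(x0):
        sw = np.sqrt(_gauss((x - t) / h))               # sqrt of kernel weights
        X = np.column_stack([np.ones_like(x), x - t])   # local linear design
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        out[j] = beta[0]                                # intercept = fit at t
    return out

def two_stage(x0, x, y, h1, h2):
    """Second transformation method: second-stage fit on (m-hat(x_i), Y_i),
    with kernel weights taken on the original x scale."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x0 = np.atleast_1d(np.asarray(x0, dtype=float))
    m1 = local_linear(x, x, y, h1)         # stage 1: m-hat at the data points
    m1_at_x0 = local_linear(x0, x, y, h1)  # stage 1: m-hat at evaluation points
    out = np.empty(x0.shape)
    for j, t in enumerate(x0):
        sw = np.sqrt(_gauss((x - t) / h2))  # weights still on the x scale
        # covariate is now the transformed variable m-hat(x_i)
        X = np.column_stack([np.ones_like(m1), m1 - m1_at_x0[j]])
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        out[j] = beta[0]
    return out
```

Since local linear fitting reproduces linear functions exactly, both stages are exact on noiseless linear data, which gives a quick sanity check of the sketch.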
