Generalized Additive Modeling with Implicit Variable Selection by Likelihood‐Based Boosting
Author(s) - Gerhard Tutz, Harald Binder
Publication year - 2006
Publication title - Biometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.298
H-Index - 130
eISSN - 1541-0420
pISSN - 0006-341X
DOI - 10.1111/j.1541-0420.2006.00578.x
Subject(s) - boosting (machine learning) , feature selection , generalized additive model , econometrics , selection (genetic algorithm) , computer science , statistics , mathematics , machine learning
Summary The use of generalized additive models in statistical data analysis suffers from the restriction to few explanatory variables and the problems of selection of smoothing parameters. Generalized additive model boosting circumvents these problems by means of stagewise fitting of weak learners. A fitting procedure is derived which works for all simple exponential family distributions, including binomial, Poisson, and normal response variables. The procedure combines the selection of variables and the determination of the appropriate amount of smoothing. Penalized regression splines and the newly introduced penalized stumps are considered as weak learners. Estimates of standard deviations and stopping criteria, which are notorious problems in iterative procedures, are based on an approximate hat matrix. The method is shown to be a strong competitor to common procedures for the fitting of generalized additive models. In particular, in high‐dimensional settings with many nuisance predictor variables it performs very well.
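The abstract describes componentwise, likelihood-based boosting: in each step every candidate weak learner (a penalized spline or penalized stump for a single predictor) is refit by one penalized likelihood step against the current fit, and only the best-performing component is added, which yields implicit variable selection. The sketch below is a minimal illustration of that idea for a binomial response with penalized stumps, not the authors' published algorithm (which is available in the R package GAMBoost); it omits the intercept refit, the approximate hat matrix, standard deviation estimates, and the hat-matrix-based stopping criterion, and all function names (fit_boosted_gam, penalized_stump_update, penalty, n_steps) are illustrative choices.

```python
# Illustrative sketch of componentwise likelihood-based boosting with
# penalized stumps for a binary response. Assumptions: binomial family,
# quantile-based candidate splits, ridge-type penalty on each stump update.
import numpy as np

def penalized_stump_update(x, y, eta, split, penalty):
    """One ridge-penalized Newton step for the stump basis sign(x > split)."""
    p = 1.0 / (1.0 + np.exp(-eta))       # current fitted probabilities
    w = p * (1.0 - p)                     # IRLS weights for the binomial family
    z = np.where(x > split, 1.0, -1.0)    # stump basis function
    score = np.sum(z * (y - p))           # gradient of the log-likelihood
    info = np.sum(w * z * z) + penalty    # penalized Fisher information
    return (score / info) * z             # contribution to the linear predictor

def deviance(y, eta):
    p = 1.0 / (1.0 + np.exp(-eta))
    eps = 1e-12
    return -2.0 * np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def fit_boosted_gam(X, y, n_steps=100, penalty=50.0, n_splits=10):
    """Stagewise fitting: per step, pick the single predictor/split whose
    penalized update improves the deviance the most (implicit selection)."""
    n, d = X.shape
    eta = np.zeros(n)                     # linear predictor, started at zero
    history = []
    for step in range(n_steps):
        best = None
        for j in range(d):
            splits = np.quantile(X[:, j], np.linspace(0.1, 0.9, n_splits))
            for s in splits:
                upd = penalized_stump_update(X[:, j], y, eta, s, penalty)
                dev = deviance(y, eta + upd)
                if best is None or dev < best[0]:
                    best = (dev, j, upd)
        dev, j, upd = best
        eta = eta + upd                   # only predictor j is updated this step
        history.append((step, j, dev))
    return eta, history

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))        # ten predictors, most are pure noise
    prob = 1.0 / (1.0 + np.exp(-np.sin(2.0 * X[:, 0])))
    y = (rng.uniform(size=200) < prob).astype(float)
    eta, hist = fit_boosted_gam(X, y, n_steps=30)
    print("predictors selected at least once:", sorted({j for _, j, _ in hist}))
```

Because the penalty keeps each update weak, the amount of smoothing is controlled jointly by the penalty and the number of boosting steps, which is the mechanism the paper exploits to combine variable selection with smoothing-parameter determination; in the paper the stopping point is chosen via an information criterion based on the approximate hat matrix rather than a fixed n_steps.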
