New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
Author(s) - Zhang Chunming, Jiang Yuan, Shang Zuofeng
Publication year - 2009
Publication title - Canadian Journal of Statistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.804
H-Index - 51
eISSN - 1708-945X
pISSN - 0319-5724
DOI - 10.1002/cjs.10005
Subject(s) - nonparametric statistics, mathematics, Bregman divergence, estimator, parametric statistics, nonparametric regression, asymptotic distribution, consistency, semiparametric regression, regression analysis, statistics, covariance, econometrics
In statistical learning, regression and classification concern different types of output variables, and predictive accuracy is quantified by different loss functions. This article explores new aspects of Bregman divergence (BD), a notion which unifies nearly all of the commonly used loss functions in regression and classification. The authors investigate the duality between BD and its generating function. Under the framework of BD, they further establish the asymptotic consistency and normality of parametric and nonparametric regression estimators, derive the lower bound of their asymptotic covariance matrices, and demonstrate the role that parametric and nonparametric regression estimation plays in the performance of classification procedures and related machine learning techniques. These theoretical results and new numerical evidence show that the choice of loss function affects estimation procedures, whereas its impact on classification performance is asymptotically negligible. Applications of BD to statistical model building and selection with non-Gaussian responses are also illustrated. The Canadian Journal of Statistics 37: 119-139; 2009 © 2009 Statistical Society of Canada
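To make the unifying notion in the abstract concrete, the following minimal Python sketch (not code from the paper; the function names and the two example generators are illustrative choices) evaluates the standard Bregman divergence D_phi(y, mu) = phi(y) - phi(mu) - phi'(mu)(y - mu) for a convex generating function phi, and shows two familiar losses arising as special cases: squared error from phi(u) = u^2, and the Bernoulli Kullback-Leibler divergence (half the binomial deviance used in classification) from the negative Bernoulli entropy.

```python
import numpy as np

def bregman(y, mu, phi, dphi):
    """Bregman divergence D_phi(y, mu) = phi(y) - phi(mu) - phi'(mu)*(y - mu),
    for scalar or array-valued y and mu."""
    y, mu = np.asarray(y, dtype=float), np.asarray(mu, dtype=float)
    return phi(y) - phi(mu) - dphi(mu) * (y - mu)

# Special case 1: phi(u) = u^2 recovers the squared-error loss (y - mu)^2.
sq_loss = bregman(1.5, 1.0, phi=lambda u: u**2, dphi=lambda u: 2 * u)
print(sq_loss)  # 0.25, i.e. (1.5 - 1.0)^2

# Special case 2: phi(u) = u*log(u) + (1-u)*log(1-u) (negative Bernoulli
# entropy) recovers KL(Bernoulli(y) || Bernoulli(mu)), the deviance-type
# loss used for classification.
def neg_entropy(u):
    return u * np.log(u) + (1 - u) * np.log(1 - u)

def dneg_entropy(u):
    return np.log(u / (1 - u))  # derivative of phi: the logit link

kl_loss = bregman(0.3, 0.6, phi=neg_entropy, dphi=dneg_entropy)
print(kl_loss)  # ~0.1838, equal to 0.3*log(0.3/0.6) + 0.7*log(0.7/0.4)
```

In this framing, choosing phi fixes the estimation loss; per the results summarized above, that choice matters for the efficiency of the regression estimator but is asymptotically negligible for the classifier it induces.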