Online Classification with Varying Gaussians
Author(s) - Hu Ting, Zhou DingXuan
Publication year - 2010
Publication title - Studies in Applied Mathematics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.164
H-Index - 46
eISSN - 1467-9590
pISSN - 0022-2526
DOI - 10.1111/j.1467-9590.2009.00462.x
Subject(s) - smoothness , Gaussian kernel , regularization , reproducing kernel Hilbert space , online learning , algorithm , machine learning , pattern recognition , mathematical analysis , mathematics
Gaussian kernels are important tools for learning from high-dimensional data. The variance of a Gaussian kernel measures the frequency range of the function components, or features, retrieved by learning algorithms induced by that Gaussian: as the variance decreases, the learning ability and approximation power increase. It is therefore natural, when samples arrive one by one, to run online algorithms with Gaussians of decreasing variance. In this paper, we consider fully online classification algorithms associated with a general loss function and varying Gaussians, which are closely related to regularization schemes in reproducing kernel Hilbert spaces. Learning rates are derived in terms of the smoothness of a target function determined by the loss function and the probability measure governing the sampling. A key step is an estimate of the norm of the difference between regularized target functions as the variance of the Gaussian changes. Concrete learning rates are presented for the online learning algorithm with the least squares loss function.
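To make the setting concrete, the following is a minimal sketch of a fully online, regularized kernel update with the least squares loss and a per-step shrinking Gaussian variance. The schedules for the step size `eta_t` and the variance `sigma_t`, and all parameter values, are illustrative assumptions, not the rates derived in the paper; the paper's analysis concerns how such schedules should be chosen to obtain learning rates.

```python
import numpy as np

def gaussian(x, c, sigma):
    """Gaussian kernel K_sigma(x, c) = exp(-||x - c||^2 / sigma^2)."""
    return np.exp(-np.sum((x - c) ** 2) / sigma ** 2)

def online_varying_gaussian(X, y, eta=0.5, lam=0.01, sigma0=2.0, decay=0.1):
    """One pass over the sample: online gradient descent in an RKHS with
    least squares loss, Tikhonov regularization, and a Gaussian kernel
    whose width shrinks over time (illustrative schedules)."""
    coeffs, centers, sigmas = [], [], []
    for t, (x_t, y_t) in enumerate(zip(X, y), start=1):
        sigma_t = sigma0 / (1.0 + decay * t) ** 0.25  # slowly decreasing width (assumed)
        eta_t = eta / t ** 0.5                        # decaying step size (assumed)
        # current prediction f_t(x_t) as a kernel expansion over past centers
        pred = sum(c * gaussian(x_t, ctr, s)
                   for c, ctr, s in zip(coeffs, centers, sigmas))
        # the regularization term shrinks all existing coefficients
        coeffs = [(1.0 - eta_t * lam) * c for c in coeffs]
        # the loss-gradient term adds a new Gaussian section centered at x_t
        coeffs.append(-eta_t * (pred - y_t))
        centers.append(x_t)
        sigmas.append(sigma_t)
    def f(x):
        return sum(c * gaussian(x, ctr, s)
                   for c, ctr, s in zip(coeffs, centers, sigmas))
    return f
```

For binary classification one would take labels in {-1, +1} and classify by the sign of the learned function; the point of the sketch is only that each incoming sample contributes one kernel section whose width depends on the current variance, so the hypothesis mixes Gaussians of different scales.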