Open Access
An SGD-based meta-learner with “growing” descent
Author(s) -
Ilona M. Kulikovskikh,
S. A. Prokhorov,
Tarzan Legović,
Tomislav Šmuc
Publication year - 2019
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1368/5/052008
Subject(s) - regret, convexity, stochastic gradient descent, rate of convergence, gradient descent, convergence (economics), mathematical optimization, computer science, logistic function, function (biology), population, mathematics, artificial intelligence, machine learning, computer network, channel (broadcasting), demography, evolutionary biology, sociology, artificial neural network, financial economics, economics, biology, economic growth
The paper considers the problem of accelerating the convergence of stochastic gradient descent (SGD) in an automatic way. Previous research has put forward algorithms such as Adagrad, Adadelta, RMSprop, and Adam to adapt both the updates and the learning rates to the slope of a loss function. However, these adaptive methods do not share the same regret bound as the gradient descent method. Adagrad provably achieves the optimal regret bound under a convexity assumption, but it accumulates the squared gradients in the denominator, which dramatically shrinks the learning rate. This research aims to introduce a generalized logistic map directly into the SGD method in order to set its parameters automatically according to the slope of the logistic loss function. The population-based optimizer may be regarded as a meta-learner that learns how to tune both the learning rate and the gradient updates with respect to the rate of population growth. The present study yields the "growing" descent method and a series of computational experiments that point out the benefits of the proposed meta-learner.
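The abstract does not spell out the authors' update rule, so the following is only a minimal illustrative sketch of the general idea: plain mini-batch SGD on a logistic-regression loss, with the effective step size modulated by a generalized logistic (Richards) growth map. The function names, the parameters r, K, nu, and eta0, and the exact coupling between the growth state and the step size are assumptions made for illustration, not the paper's formulation; in particular, the paper presumably ties the growth dynamics to the slope of the loss, whereas this sketch uses a fixed map for simplicity.

import numpy as np

def logistic_loss_grad(w, X, y):
    """Gradient of the logistic loss over a mini-batch (labels in {0, 1})."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def growing_sgd(X, y, n_epochs=50, batch=32, eta0=0.5, r=0.3, K=1.0, nu=1.0,
                seed=0):
    """Hypothetical 'growing' descent: SGD whose learning rate follows a
    generalized logistic (population-growth) map. All parameter names and the
    growth form are illustrative assumptions, not the authors' method."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    pop = 0.05  # "population" state that grows toward the carrying capacity K
    for _ in range(n_epochs):
        idx = rng.permutation(len(y))
        for start in range(0, len(y), batch):
            b = idx[start:start + batch]
            g = logistic_loss_grad(w, X[b], y[b])
            # Discrete generalized logistic (Richards) growth step.
            pop += r * pop * (1.0 - (pop / K) ** nu)
            # Let the effective learning rate "grow" with the population:
            # small, cautious steps early on, approaching eta0 as pop -> K.
            w -= eta0 * (pop / K) * g
    return w

# Usage on synthetic data (hypothetical):
# X = np.hstack([np.random.randn(500, 5), np.ones((500, 1))])
# y = (X @ np.ones(6) > 0).astype(float)
# w = growing_sgd(X, y)

Under these assumptions the growth map acts like a self-contained warm-up schedule: unlike Adagrad's accumulated squared gradients, which can only shrink the step size, the population state saturates at K rather than driving the learning rate toward zero.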
