sKAdam: An improved scalar extension of KAdam for function optimization
Author(s) -
José David Camacho,
Carlos Villaseñor,
Alma Y. Alanís,
Carlos López-Franco,
Nancy Arana-Daniel
Publication year - 2020
Publication title -
Intelligent Data Analysis
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.231
H-Index - 47
eISSN - 1571-4128
pISSN - 1088-467X
DOI - 10.3233/ida-200010
Subject(s) - kalman filter , extension (predicate logic) , scalar (mathematics) , computer science , algorithm , exponential function , mathematical optimization , function (biology) , mathematics , artificial intelligence , mathematical analysis , evolutionary biology , biology , programming language , geometry
This paper presents an improved extension of the authors' previous algorithm, KAdam, which was proposed as a combination of the Adam algorithm, a first-order gradient-based optimizer of stochastic objective functions, with the Kalman filter. In the extension presented here, each parameter of the objective function is filtered using a 1-D Kalman filter; this allows us to switch from matrix and vector calculations to scalar operations. Moreover, the impact of the measurement-noise factor of the Kalman filter is reduced by using an exponential decay as a function of the number of training epochs. Therefore, in this paper we introduce our proposed method, sKAdam, a straightforward improvement over the original algorithm. This extension of KAdam offers reduced execution time, reduced computational complexity, and better accuracy, while keeping Adam's properties of being well suited for problems with large datasets and/or many parameters, non-stationary objectives, and noisy and/or sparse gradients.
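To make the idea in the abstract concrete, below is a minimal sketch, not the authors' implementation: it assumes the 1-D Kalman filter is applied element-wise to each parameter's gradient before a standard Adam step, and that the measurement-noise term decays exponentially with the epoch index. The function name skadam_sketch and all hyperparameter values (q, r0, decay) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def skadam_sketch(grad_fn, theta, epochs=100, lr=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8,
                  q=1e-5, r0=1.0, decay=0.05):
    """Adam whose per-parameter gradients are smoothed by independent
    1-D Kalman filters; the measurement noise r decays exponentially
    with the epoch count (hyperparameters q, r0, decay are assumed)."""
    n = theta.size
    m = np.zeros(n)          # Adam first-moment estimate
    v = np.zeros(n)          # Adam second-moment estimate
    x_hat = np.zeros(n)      # Kalman state estimate per parameter (filtered gradient)
    p = np.ones(n)           # Kalman error covariance per parameter (a scalar each)

    for t in range(1, epochs + 1):
        g = grad_fn(theta)                   # raw gradient acts as the Kalman "measurement"
        r = r0 * np.exp(-decay * t)          # measurement noise shrinks as training progresses

        # --- scalar Kalman filter, applied element-wise ---
        p_pred = p + q                       # predict step: identity state model plus process noise
        k = p_pred / (p_pred + r)            # Kalman gain
        x_hat = x_hat + k * (g - x_hat)      # correct the estimate with the new measurement
        p = (1.0 - k) * p_pred               # update the error covariance

        # --- standard Adam step on the filtered gradient ---
        m = beta1 * m + (1 - beta1) * x_hat
        v = beta2 * v + (1 - beta2) * x_hat ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta


# Usage example: minimize a simple quadratic f(x) = ||x||^2.
if __name__ == "__main__":
    theta0 = np.array([3.0, -2.0])
    result = skadam_sketch(lambda x: 2.0 * x, theta0, epochs=2000, lr=1e-2)
    print(result)  # should approach [0, 0]
```

Because every filter is one-dimensional, the predict/correct equations reduce to scalar arithmetic broadcast across the parameter vector, which is the source of the reduced computational complexity claimed over the matrix-based KAdam formulation.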
