Development of a hyperparameter optimization method for recommendatory models based on matrix factorization
Author(s) -
A. A. Nechaev,
Vasily Meltsov,
Dmitry Strabykin
Publication year - 2021
Publication title -
Eastern-European Journal of Enterprise Technologies
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.268
H-Index - 24
eISSN - 1729-4061
pISSN - 1729-3774
DOI - 10.15587/1729-4061.2021.239124
Subject(s) - hyperparameter, bayesian optimization, gaussian process, hyperparameter optimization, factorization, computer science, algorithm, matrix (chemical analysis), bayesian probability, mathematics, surrogate model, mathematical optimization, gaussian, artificial intelligence, physics, materials science, quantum mechanics, support vector machine, composite material
Many advanced recommendatory models are implemented using matrix factorization algorithms, and experiments show that the quality of their performance depends significantly on the selected hyperparameters. An analysis of the effectiveness of various methods for optimizing these hyperparameters showed that classical Bayesian optimization, which treats the model as a "black box", remains the standard solution. However, models based on matrix factorization have a number of characteristic features, and exploiting them makes it possible to modify the optimization process so that the time required to find the sought points decreases without a loss of quality.

A modification of the kernel of the Gaussian process used as a surrogate model for the loss function during Bayesian optimization is proposed. During the first iterations, the modification increases the variance of the values predicted by the Gaussian process over a given region of the hyperparameter space. In some cases, this makes it possible to obtain more information about the true shape of the investigated loss function in less time.

Experiments were carried out on well-known datasets for recommendatory systems. The total optimization time with the modification was reduced by 16 % (263 seconds) in the best case and remained unchanged in the worst case (less than a one-second difference). At the same time, the expected error of the recommendatory model did not change (the absolute difference in values is two orders of magnitude smaller than the error reduction achieved during optimization). Thus, the proposed modification helps to find a better set of hyperparameters in less time without loss of quality.
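The following is a minimal sketch, not the authors' implementation, of the idea described in the abstract: Bayesian optimization with a Gaussian-process surrogate whose kernel adds extra predictive variance over a chosen region of the hyperparameter space during early iterations, with the boost decaying as more observations arrive. The one-dimensional search space, the placeholder loss function, the boost region, the decay schedule, and all names (boosted_kernel, boost_region, boost_scale) are assumptions made for illustration only.

```python
# Sketch of Bayesian optimization over one hyperparameter with a GP surrogate.
# The kernel boost that inflates early-iteration variance over a region of the
# search space is a hypothetical stand-in for the modification described above.
import numpy as np

def rbf(a, b, length=0.2, var=1.0):
    """Standard squared-exponential (RBF) kernel between two 1-D point sets."""
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return var * np.exp(-0.5 * (d / length) ** 2)

def boosted_kernel(a, b, iteration, boost_region=(0.2, 0.5), boost_scale=0.5):
    """RBF kernel plus a variance boost inside boost_region that fades
    as the iteration count grows (illustrative decay schedule)."""
    k = rbf(a, b)
    decay = boost_scale / (1.0 + iteration)          # strong early, vanishing later
    in_a = ((a >= boost_region[0]) & (a <= boost_region[1])).astype(float)
    in_b = ((b >= boost_region[0]) & (b <= boost_region[1])).astype(float)
    return k + decay * np.outer(in_a, in_b)

def gp_posterior(x_train, y_train, x_query, iteration, noise=1e-6):
    """Exact GP posterior mean and variance at the query points."""
    K = boosted_kernel(x_train, x_train, iteration) + noise * np.eye(len(x_train))
    Ks = boosted_kernel(x_train, x_query, iteration)
    Kss = boosted_kernel(x_query, x_query, iteration)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_train
    var = np.diag(Kss - Ks.T @ K_inv @ Ks)
    return mu, np.maximum(var, 0.0)

def loss(lmbda):
    """Placeholder for the recommender's validation error as a function of a
    single hyperparameter (e.g. a regularization coefficient)."""
    return (lmbda - 0.35) ** 2 + 0.01 * np.sin(40 * lmbda)

# Bayesian-optimization loop with a lower-confidence-bound acquisition function.
grid = np.linspace(0.0, 1.0, 200)
x_obs = np.array([0.05, 0.95])
y_obs = loss(x_obs)
for it in range(10):
    mu, var = gp_posterior(x_obs, y_obs, grid, it)
    lcb = mu - 2.0 * np.sqrt(var)                    # prefer low mean or high variance
    x_next = grid[np.argmin(lcb)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, loss(x_next))

print("best hyperparameter:", x_obs[np.argmin(y_obs)], "loss:", y_obs.min())
```

Because the boost term decays with the iteration index, early acquisitions are pulled toward the boosted region (gathering information about the loss surface there quickly), while later iterations revert to ordinary GP-based exploration, which matches the "more information in less time at first iterations" behaviour described in the abstract.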
