Open Access
Nonlinear optimization algorithm using monotonically increasing quantization resolution
Author(s) - Seok Jinwuk, Kim JeongSi
Publication year - 2023
Publication title - ETRI Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.295
H-Index - 46
eISSN - 2233-7326
pISSN - 1225-6463
DOI - 10.4218/etrij.2021-0320
Subject(s) - quantization (signal processing), Linde–Buzo–Gray algorithm, mathematics, monotonic function, algorithm, mathematical optimization, optimization problem, independent and identically distributed random variables, random variable, mathematical analysis, statistics
Abstract We propose a quantized gradient search algorithm that can achieve global optimization by monotonically reducing the quantization step over time, where quantization maps the iterates to integer or fixed-point fractional values within the optimization algorithm. According to the white noise hypothesis, when the quantization step is sufficiently small and the quantization is well defined, the round-off error caused by quantization can be regarded as an independent and identically distributed random variable. We therefore rewrite the gradient-descent search equation as a stochastic differential equation and, by stochastic analysis of the objective function, derive the monotonically decreasing rate of the quantization step that enables global optimization. Consequently, when the search equation is quantized with a monotonically decreasing quantization step that suitably reduces the round-off error, we obtain a global search algorithm evolving from a local optimization algorithm. Numerical simulations indicate that, owing to this quantization-based global optimization property, the proposed algorithm explores the search space more effectively at each iteration than the conventional algorithm, achieving a higher success rate with fewer iterations.
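
As a rough sketch of the idea (not the authors' exact formulation), the following Python fragment runs gradient descent whose iterates are rounded onto a grid with spacing q_t = q0 * decay^t that shrinks monotonically: the coarse early grid produces large round-off perturbations that play the role of the i.i.d. noise in the abstract's stochastic-differential-equation view, while the geometric decay schedule here merely stands in for the rate the paper derives. The learning rate, schedule constants, and Rastrigin-style test gradient are all illustrative assumptions.

import numpy as np

def quantize(x, step):
    # Round each coordinate to the nearest multiple of the quantization step.
    return np.round(x / step) * step

def quantized_gradient_search(grad, x0, lr=0.01, q0=1.0, decay=0.5,
                              n_stages=20, n_iters=200):
    # Quantized gradient descent with a monotonically decreasing
    # quantization step q_t = q0 * decay**t (illustrative schedule).
    x = np.asarray(x0, dtype=float)
    for t in range(n_stages):
        q = q0 * decay ** t
        for _ in range(n_iters):
            x = quantize(x - lr * grad(x), q)
    return x

# Illustrative multimodal test: gradient of the Rastrigin function,
# whose global minimum is at the origin.
def rastrigin_grad(x, A=10.0):
    return 2.0 * x + 2.0 * np.pi * A * np.sin(2.0 * np.pi * x)

x_opt = quantized_gradient_search(rastrigin_grad, x0=[3.3, -2.7])
print(x_opt)  # quantized stationary point found; not guaranteed global here

Because the quantization noise decays with the step size, the sketch behaves loosely like simulated annealing with a shrinking temperature; the paper's contribution is deriving the specific decreasing rate for which global convergence can be argued, which this toy schedule does not reproduce.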
