Open Access
Deep Neural Network with Adaptive Parametric Rectified Linear Units and its Fast Learning
Author(s) -
Yevgeniy Bodyanskiy,
Anastasiia Deineko,
Viktoria Skorik,
Filip Brodetskyi
Publication year - 2022
Publication title -
Computing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.184
H-Index - 11
eISSN - 2312-5381
pISSN - 1727-6209
DOI - 10.47839/ijc.21.1.2512
Subject(s) - activation function, computer science, artificial neural network, backpropagation, parametric statistics, convolutional neural network, feature (linguistics), artificial intelligence, set (abstract data type), algorithm, deep learning, function (biology), control theory (sociology), mathematics, linguistics, statistics, philosophy, control (management), evolutionary biology, biology, programming language
The adaptive parametric rectified linear unit (AdPReLU) is proposed in this article as an activation function for deep neural networks. The main benefit of the proposed system is an adjustable activation function whose parameters are tuned in parallel with the synaptic weights in online mode. An algorithm for the simultaneous learning of all parameters of neurons with AdPReLU is introduced, together with a modified backpropagation procedure based on this algorithm. The approach under consideration reduces the required volume of training data and increases the tuning speed of a DNN with AdPReLU. The proposed approach can be applied in deep convolutional neural networks (CNNs) under conditions of small training data sets and additional requirements for system performance. The main feature of the DNN under consideration is the possibility of tuning not only the synaptic weights but also the parameters of the activation function. The effectiveness of this approach is demonstrated by experimental modeling.
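The abstract's core idea, a rectifier whose slope parameters are learned jointly with the synaptic weights, can be sketched as follows. This is a minimal toy illustration, not the authors' algorithm: the abstract does not give the exact AdPReLU form or its update rules, so the two-slope parameterization (alpha for the positive branch, beta for the negative branch) and the plain gradient-descent updates below are assumptions.

```python
import numpy as np

def adprelu(x, alpha, beta):
    # Assumed two-slope parametric rectifier: alpha*x for x >= 0, beta*x otherwise.
    return np.where(x >= 0, alpha * x, beta * x)

def adprelu_grads(x, alpha, beta):
    # Partial derivatives w.r.t. the input and both slope parameters.
    d_x = np.where(x >= 0, alpha, beta)
    d_alpha = np.where(x >= 0, x, 0.0)
    d_beta = np.where(x >= 0, 0.0, x)
    return d_x, d_alpha, d_beta

rng = np.random.default_rng(0)
X = rng.normal(size=200)
y = np.where(X >= 0, 2.0 * X, 0.1 * X)  # toy target: a "leaky" piecewise-linear map

w, alpha, beta = 1.0, 1.0, 0.5          # one synaptic weight + activation parameters
lr = 0.05
for _ in range(2000):
    z = w * X
    err = adprelu(z, alpha, beta) - y
    d_z, d_a, d_b = adprelu_grads(z, alpha, beta)
    # Joint update of the weight AND the activation parameters,
    # mimicking the simultaneous tuning described in the abstract.
    w -= lr * float(np.mean(err * d_z * X))
    alpha -= lr * float(np.mean(err * d_a))
    beta -= lr * float(np.mean(err * d_b))

final_mse = float(np.mean((adprelu(w * X, alpha, beta) - y) ** 2))
print(final_mse)
```

Because the slopes themselves are trainable, the unit can match an asymmetric piecewise-linear target that a fixed ReLU cannot, which is the mechanism the abstract credits for faster tuning on small data sets.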
