
Random Number Generator for the Weights of the Conjugate Gradient Neural Network Method (Random Number Generator Untuk Bobot Metode Conjugate Gradient Neural Network)
Author(s) - Yudistira Arya Sapoetra, Azwar Riza Habibi, Lukman Hakim
Publication year - 2019
Publication title - Jurnal Derivat
Language(s) - English
Resource type - Journals
eISSN - 2549-2616
pISSN - 2407-3792
DOI - 10.31316/j.derivat.v4i1.161
Subject(s) - conjugate gradient method , artificial neural network , weighting , backpropagation , convergence (economics) , computer science , generator (circuit theory) , algorithm , mathematics , artificial intelligence , physics , power (physics) , quantum mechanics , acoustics , economics , economic growth
This research develops the theory of neural networks (NN) by using the conjugate gradient (CG) method to speed up convergence of an NN. The CG algorithm is an iterative algorithm for solving large-scale systems of simultaneous linear equations, and it is used here to optimize the backpropagation training process. During training, a neural network assigns random values to the weights v and w, and these initial weights affect the speed of convergence of the CG-based NN algorithm. In this research, the random numbers used to generate the network weights are sampled from the uniform (0,1) distribution. Therefore, the aim of this research is to improve the convergence of NN weight training by using weights generated randomly by the generator, which are then corrected with the CG method.

Keywords: neural network, backpropagation, weighting, conjugate gradient
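As a concrete illustration of the setup described in the abstract, the following is a minimal sketch (not the authors' code) of initializing the weights v and w of a small backpropagation network with uniform(0,1) random numbers and then refining them with a Fletcher-Reeves conjugate gradient update in place of plain gradient descent. The XOR data, layer sizes, fixed step size, and helper names are assumptions made for the example, not details taken from the paper.

import numpy as np

rng = np.random.default_rng(0)          # random number generator for the weights

# Toy XOR problem (hypothetical example data, not from the paper)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 4, 1
v = rng.uniform(0, 1, size=(n_in, n_hidden))    # input -> hidden weights
w = rng.uniform(0, 1, size=(n_hidden, n_out))   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(v, w):
    """Forward pass, squared-error loss, and backpropagated gradients."""
    h = sigmoid(X @ v)                  # hidden activations
    y = sigmoid(h @ w)                  # network output
    err = y - T
    loss = 0.5 * np.sum(err ** 2)
    delta_out = err * y * (1 - y)       # output-layer delta
    delta_hid = (delta_out @ w.T) * h * (1 - h)
    return loss, X.T @ delta_hid, h.T @ delta_out

def flatten(gv, gw):
    return np.concatenate([gv.ravel(), gw.ravel()])

# Fletcher-Reeves conjugate gradient iterations with a fixed step size
# (a line search would normally be used; a constant step keeps the sketch short).
loss, gv, gw = loss_and_grad(v, w)
g = flatten(gv, gw)
d = -g                                   # initial direction = steepest descent
step = 0.5
for it in range(2000):
    dv = d[:v.size].reshape(v.shape)
    dw = d[v.size:].reshape(w.shape)
    v += step * dv                       # move along the current search direction
    w += step * dw
    loss, gv, gw = loss_and_grad(v, w)
    g_new = flatten(gv, gw)
    beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
    d = -g_new + beta * d                # conjugate direction update
    g = g_new

print("final loss:", loss)

The point of the sketch is only the interplay the abstract describes: the quality of the uniform(0,1) starting weights determines how quickly the conjugate gradient corrections drive the loss down compared with ordinary gradient-descent backpropagation.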