
Performance One-step secant Training Method for Forecasting Cases
Author(s) -
Ni Luh Wiwik Sri Rahayu Ginantra,
Gita Widi Bhawika,
GS Achmad Daengs,
Pawer Darasa Panjaitan,
Mohammad Aryo Arifin,
Anjar Wanto,
Muhammad Amin,
Harly Okprana,
Abdullah Syafii,
Asad Umar
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1933/1/012032
Subject(s) - backpropagation , computer science , artificial neural network , process (computing) , training (meteorology) , test data , secant method , function (biology) , algorithm , artificial intelligence , data mining , machine learning , pattern recognition (psychology) , newton's method , physics , nonlinear system , quantum mechanics , meteorology , evolutionary biology , biology , programming language , operating system
The training function used in an artificial neural network (ANN), particularly in backpropagation, can yield different forecasting accuracy depending on the method parameters and the data to be predicted. This paper analyzes the capability and performance of one training function of the backpropagation algorithm, One-step secant, so that it can later serve as a reference for data forecasting. The method updates the bias and weight values according to the one-step secant rule. The analysis uses a dataset of Indonesia's Foreign Exchange Reserves (US$ million) for 2011-2020, divided into two parts: the training data uses the 2011-2014 values as input with 2015 as the training target, while the test data uses the 2016-2019 values as input with 2020 as the test target. Five experimental architectures are examined: 4-5-1, 4-7-1, 4-9-1, 4-11-1 and 4-13-1. Based on the analysis, the best network architecture is 4-11-1, with an MSE training value of 0.12, an MSE testing/performance of 0.00115144 (the smallest among the architectures), and 343 epochs.
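The abstract states that the one-step secant (OSS) method updates weights and biases without storing a full Hessian approximation. The paper's own implementation is not shown here; the following is a minimal sketch of Battiti's memoryless one-step secant search direction with an Armijo backtracking line search, demonstrated on a toy quadratic loss rather than a real network error surface (the function names, step-size constants, and test problem are all illustrative assumptions):

```python
import numpy as np

def backtracking(f, w, d, g, alpha=1.0, beta=0.5, c=1e-4):
    """Armijo backtracking line search along a descent direction d."""
    fw, slope = f(w), g @ d
    while f(w + alpha * d) > fw + c * alpha * slope:
        alpha *= beta
        if alpha < 1e-12:           # give up shrinking; accept tiny step
            break
    return alpha

def oss_minimize(f, grad, w0, n_iter=500, tol=1e-10):
    """Minimize f using the one-step secant search direction (Battiti)."""
    w = np.asarray(w0, dtype=float)
    g = grad(w)
    d = -g                          # first iteration: steepest descent
    for _ in range(n_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:              # safeguard: reset to a descent direction
            d = -g
        alpha = backtracking(f, w, d, g)
        w_new = w + alpha * d
        g_new = grad(w_new)
        s, y = w_new - w, g_new - g  # previous step and gradient change
        sy = s @ y
        if abs(sy) > 1e-12:
            # memoryless secant coefficients: only the vectors s and y
            # from the last step are kept, never a Hessian matrix
            B = (s @ g_new) / sy
            A = -(1.0 + (y @ y) / sy) * B + (y @ g_new) / sy
            d = -g_new + A * s + B * y
        else:
            d = -g_new
        w, g = w_new, g_new
    return w

# toy quadratic "loss" with minimum at (1, 2), standing in for network error
f = lambda w: (w[0] - 1.0) ** 2 + 2.0 * (w[1] - 2.0) ** 2
grad = lambda w: np.array([2.0 * (w[0] - 1.0), 4.0 * (w[1] - 2.0)])
w_star = oss_minimize(f, grad, [0.0, 0.0])
```

In a backpropagation setting, `w` would hold all network weights and biases flattened into one vector, and `grad` would be the backpropagated error gradient; the direction update itself is unchanged.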
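The 4-n-1 architectures imply that each network takes four consecutive yearly values as input and predicts the following year. A minimal sketch of the train/test split described in the abstract, using placeholder reserve figures (the actual Indonesian values are not given in this record):

```python
# hypothetical yearly reserve figures (US$ million) for 2011-2020;
# placeholders only, not the actual data used in the paper
series = {
    2011: 110.1, 2012: 112.8, 2013: 99.4, 2014: 111.9, 2015: 105.9,
    2016: 116.4, 2017: 130.2, 2018: 120.7, 2019: 129.2, 2020: 135.9,
}

# each 4-n-1 network maps four consecutive years onto the next one,
# matching the split stated in the abstract
x_train = [series[y] for y in range(2011, 2015)]   # 2011-2014 inputs
y_train = series[2015]                             # 2015 training target
x_test  = [series[y] for y in range(2016, 2020)]   # 2016-2019 inputs
y_test  = series[2020]                             # 2020 test target
```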