The Error Performances of Some Residual Optimization Methods
Author(s) - Setyono
Publication year - 2016
Publication title - Informatika Pertanian
Language(s) - English
Resource type - Journals
eISSN - 2540-9875
pISSN - 0852-1743
DOI - 10.21082/ip.v24n2.2015.p191-204
Subject(s) - residual, statistics, mathematics, residual sum of squares, mean squared error, PRESS statistic, regression analysis, algorithm, F-test, total least squares, ancillary statistic
A good statistic is unbiased and efficient. Because data encountered in practice are samples of finite size, the statistic required is not an unbiased one but one with small error. When only sample data are available, what can be optimized is not the error itself but the residual. This study examines the error performance of three residual optimization methods: minimizing the maximum absolute residual (MLAD), minimizing the sum of absolute residuals (LAD), and minimizing the sum of squared residuals (LS). Simulation experiments showed that when the data follow a uniform distribution, minimizing the maximum absolute residual yields the smallest error, whereas minimizing the sum of squared residuals yields the smallest error when the data follow a normal or exponential distribution. These results hold when the statistics being estimated are measures of central tendency, regression coefficients, and regression responses.
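The comparison in the abstract can be illustrated for the simplest case, estimating a measure of central tendency, where each criterion reduces to a familiar estimator: LS (minimum sum of squared residuals) gives the sample mean, LAD (minimum sum of absolute residuals) gives the sample median, and MLAD (minimum maximum absolute residual) gives the midrange. The following is a minimal Monte Carlo sketch of that comparison, not the paper's actual simulation code; sample size, replication count, and distributions are illustrative assumptions.

```python
# Sketch (assumed design, not the paper's code): compare the mean squared
# error of three location estimators, each the solution of one residual
# optimization criterion:
#   LS   (min sum of squared residuals)  -> sample mean
#   LAD  (min sum of absolute residuals) -> sample median
#   MLAD (min max absolute residual)     -> midrange (min + max) / 2
import random
import statistics

def estimators(sample):
    ls = statistics.fmean(sample)           # least squares -> mean
    lad = statistics.median(sample)         # least absolute deviations -> median
    mlad = (min(sample) + max(sample)) / 2  # minimax residual -> midrange
    return ls, lad, mlad

def mse(draw, true_center, n=50, reps=2000, seed=1):
    """Monte Carlo mean squared error of the three estimators."""
    rng = random.Random(seed)
    errs = [0.0, 0.0, 0.0]
    for _ in range(reps):
        sample = [draw(rng) for _ in range(n)]
        for i, est in enumerate(estimators(sample)):
            errs[i] += (est - true_center) ** 2
    return [e / reps for e in errs]

# Uniform data: the midrange (MLAD) should have the smallest error.
uniform_mse = mse(lambda r: r.uniform(-1.0, 1.0), 0.0)
# Normal data: the mean (LS) should have the smallest error.
normal_mse = mse(lambda r: r.gauss(0.0, 1.0), 0.0)
```

Consistent with the abstract's findings, the midrange exploits the sharp endpoints of the uniform distribution, while the mean is the efficient estimator under normality.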