Open Access
Possibility of Decrease in a Level of Data Correlation During Processing Small Samples Using Neural Networks by Generating New Statistic Tests
Author(s) -
Alexander Ivanov,
A. G. Bannykh,
P. S. Lozhnikov,
A. E. Sulavko,
D. P. Inivatov
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1546/1/012080
Subject(s) - statistic , normality , artificial neural network , test statistic , correlation , statistics , computer science , statistical hypothesis testing , normality test , artificial intelligence , data mining , pattern recognition (psychology) , mathematics , geometry
Statistical tests created in the 20th century can each be mapped to an equivalent artificial neuron. As a result, a network of dozens of artificial neurons, combining dozens of known statistical tests, can be used to validate the normality hypothesis. The quality of the decisions made by such a neural network depends on the number of neurons (tests) used. This gives rise to the task of creating new statistical tests (neurons) whose decisions, first and foremost, correlate weakly with those of known tests. The paper presents a forecast of the attainable confidence probabilities for validating the normality hypothesis on a small sample using a network of 21 artificial neurons, each mapped to one traditional statistical test. When new tests (to be created in the 21st century) are used, the correlation of the data is expected to decrease considerably, which should allow an approximately 10-fold reduction in error probabilities.
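The idea of mapping statistical tests to neurons whose binary outputs are combined can be sketched in a few lines. The following is a minimal illustration, not the authors' 21-neuron network: it uses three simple normality statistics (sample skewness, excess kurtosis, and the studentized range) as "neurons" that each fire when their statistic is consistent with normality, combined by majority vote. The acceptance thresholds (roughly two asymptotic standard errors, and a rough interval for the studentized range at n near 21) are illustrative assumptions.

```python
import math
import statistics

def central_moment(x, k):
    # k-th central moment of the sample (population convention)
    m = statistics.fmean(x)
    return sum((v - m) ** k for v in x) / len(x)

def skewness(x):
    return central_moment(x, 3) / central_moment(x, 2) ** 1.5

def excess_kurtosis(x):
    return central_moment(x, 4) / central_moment(x, 2) ** 2 - 3.0

# Each "neuron" outputs 1 when its statistic looks consistent with normality.
def neuron_skew(x):
    # |skewness| within ~2 asymptotic standard errors, sqrt(6/n)
    return int(abs(skewness(x)) < 2 * math.sqrt(6 / len(x)))

def neuron_kurtosis(x):
    # |excess kurtosis| within ~2 asymptotic standard errors, sqrt(24/n)
    return int(abs(excess_kurtosis(x)) < 2 * math.sqrt(24 / len(x)))

def neuron_range(x):
    # studentized range; rough illustrative acceptance interval for n near 21
    w = (max(x) - min(x)) / statistics.stdev(x)
    return int(2.5 < w < 5.0)

def accepts_normality(x):
    # the "network": a simple majority vote over the three neurons
    return neuron_skew(x) + neuron_kurtosis(x) + neuron_range(x) >= 2

# Deterministic 21-point samples built from distribution quantiles
n = 21
ps = [(i + 0.5) / n for i in range(n)]
normal_like = [statistics.NormalDist().inv_cdf(p) for p in ps]
cauchy_like = [math.tan(math.pi * (p - 0.5)) for p in ps]  # heavy-tailed

print(accepts_normality(normal_like))   # normal-shaped sample: accepted
print(accepts_normality(cauchy_like))   # heavy-tailed sample: rejected
```

Note how the vote illustrates the correlation problem the abstract raises: the symmetric Cauchy-like sample fools the skewness neuron, and only the kurtosis and range neurons reject it, so the network's error rate depends on how independent the individual tests' decisions are.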
