Open Access
Normality Testing for Vectors on Perceptron Layers
Author(s) -
Youmna Shawki Karaki,
Halina Kaubasa,
Nick Ivanov
Publication year - 2020
Publication title -
European Journal of Engineering and Technology Research
Language(s) - English
Resource type - Journals
ISSN - 2736-576X
DOI - 10.24018/ejeng.2020.5.9.2090
Subject(s) - normality, multivariate normal distribution, hyperparameter, artificial neural network, computer science, perceptron, artificial intelligence, Gaussian, machine learning, multilayer perceptron, multivariate statistics, pattern recognition (psychology), mathematics, statistics, physics, quantum mechanics
Designing an optimal network graph topology is one of the most prevalent issues in neural network applications. The number of hidden layers, the number of nodes per layer, the activation functions, and other parameters of a neural network must suit the given data set and the problem at hand. Massive learning datasets prompt researchers to exploit probabilistic methods in the search for an optimal neural network structure. Classic Bayesian estimation of network hyperparameters assumes that the distribution of certain random parameters is Gaussian. Multivariate normality analysis methods are widespread in contemporary applied mathematics. In this article, the normality of the probability distribution of vectors on perceptron layers was examined with a multivariate normality test. Ten datasets from the University of California, Irvine repository were selected for the computing experiment. The result is negative for the hypothesis of Gaussian distribution: none of the sets of vectors passed the normality criteria.
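The abstract does not name the specific multivariate normality test used, so as an illustration, the sketch below implements one common choice, Mardia's skewness and kurtosis test, applied to a matrix whose rows could be activation vectors collected from a perceptron layer. The function name `mardia_test` and the significance level `alpha` are hypothetical choices for this sketch, not details taken from the paper.

```python
import numpy as np
from scipy import stats

def mardia_test(X, alpha=0.05):
    """Mardia's multivariate normality test (skewness + kurtosis).

    X : (n, p) array, one observation (e.g. one layer-activation vector) per row.
    Returns (p_skew, p_kurt, normal), where `normal` is True only if
    neither component of the test rejects at level `alpha`.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # MLE (biased) covariance, as is conventional for Mardia's statistics;
    # assumes S is non-singular (n > p and no collinear features).
    S = np.cov(X, rowvar=False, bias=True)
    Sinv = np.linalg.inv(S)
    D = Xc @ Sinv @ Xc.T                    # pairwise Mahalanobis products

    # Multivariate skewness: b1p = (1/n^2) * sum_ij D_ij^3
    b1 = (D ** 3).sum() / n ** 2
    skew_stat = n * b1 / 6.0                # ~ chi2 with p(p+1)(p+2)/6 df
    df = p * (p + 1) * (p + 2) / 6.0
    p_skew = stats.chi2.sf(skew_stat, df)

    # Multivariate kurtosis: b2p = (1/n) * sum_i D_ii^2
    b2 = (np.diag(D) ** 2).mean()
    # Under normality, E[b2p] = p(p+2) and Var[b2p] = 8p(p+2)/n
    kurt_stat = (b2 - p * (p + 2)) / np.sqrt(8.0 * p * (p + 2) / n)
    p_kurt = 2.0 * stats.norm.sf(abs(kurt_stat))

    normal = bool(p_skew > alpha and p_kurt > alpha)
    return p_skew, p_kurt, normal
```

A clearly skewed sample (e.g. exponential marginals) should be rejected by the skewness component, mirroring the paper's finding that none of the examined vector sets passed the normality criteria.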
