Open Access
SoftMax Neural Best Approximation
Author(s) - Hawraa Abbas Almurieb, Eman Samir Bhaya
Publication year - 2020
Publication title - IOP Conference Series: Materials Science and Engineering
Language(s) - English
Resource type - Journals
eISSN - 1757-899X
pISSN - 1757-8981
DOI - 10.1088/1757-899x/871/1/012040
Subject(s) - softmax function , sigmoid function , smoothness , Lebesgue integration , artificial neural network , mathematics , modulus of continuity , activation function , nonlinear system , pure mathematics , mathematical analysis , computer science , artificial intelligence
Neural networks play an important role in approximating nonlinear functions, in particular Lebesgue integrable functions, which can be approximated by feedforward neural networks (FNNs) with one hidden layer and sigmoidal activation functions. Various neural network operators have been defined and shown to achieve good rates of approximation in terms of the modulus of smoothness. Here we define a new neural network operator with a generalized sigmoidal function (SoftMax) to improve the rate of approximation of Lebesgue integrable functions in L_p, p < 1, with the error estimated using the modulus of smoothness of order k. The importance of choosing the SoftMax function as an activation function lies in its flexible properties and its wide range of applications.
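As a loose illustration of the setting, the sketch below (plain NumPy, not the operator constructed in the paper) builds a one-hidden-layer network whose hidden units use a softmax-style activation, then picks the best outer coefficients for a nonlinear target by least squares. The target function, layer sizes, and weight scales are all assumptions made for the demo, and the fit is in L_2 rather than the L_p, p < 1, setting studied here.

```python
# Minimal sketch, assuming a random softmax hidden layer and a
# least-squares read-out; this is NOT the paper's operator.
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 400).reshape(-1, 1)
target = np.sin(np.pi * x)            # nonlinear function to approximate (assumed)

n_hidden = 32                         # hidden width (assumed)
W1 = rng.normal(scale=4.0, size=(1, n_hidden))  # input -> hidden weights
b1 = rng.normal(scale=2.0, size=n_hidden)       # hidden biases

H = softmax(x @ W1 + b1)              # softmax-activated hidden features
# Best outer coefficients in the least-squares (L_2) sense; the paper
# studies best approximation in L_p with p < 1 instead.
W2, *_ = np.linalg.lstsq(H, target, rcond=None)
approx = H @ W2

print("max abs error:", float(np.max(np.abs(approx - target))))
```

In this sketch the least-squares step plays the role of choosing the network's outer coefficients, which is where best-approximation arguments typically enter; the paper's actual construction and its L_p error bounds are given in the full text.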
