A Non-Polynomial, Non-Sigmoidal, Bounded and Symmetric Activation Function for Feed-Forward Artificial Neural Networks
Author(s) - Apoorvi Sood*, P. Chandra, Udayan Ghose
Publication year - 2019
Publication title - International Journal of Innovative Technology and Exploring Engineering
Language(s) - English
Resource type - Journals
ISSN - 2278-3075
DOI - 10.35940/ijitee.l3313.1081219
Subject(s) - sigmoid function, activation function, hyperbolic function, artificial neural network, bounded function, polynomial, function (biology), mathematics, computer science, feedforward neural network, tangent, mathematical optimization, artificial intelligence, mathematical analysis, geometry, evolutionary biology, biology
Feed-forward artificial neural networks are universal approximators of continuous functions. This property enables the use of these networks to solve learning tasks, which in this paradigm are cast as function approximation problems. The universal approximation results for these networks require at least one hidden layer with non-linear nodes, and also require that the non-linearities be non-polynomial in nature. In this paper, a non-polynomial and non-sigmoidal non-linear function is proposed as a suitable activation function for these networks. The usefulness of the proposed activation function is demonstrated on 12 function approximation tasks. The obtained results show that the proposed activation function outperforms the logistic/log-sigmoid and the hyperbolic tangent activation functions.
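The abstract does not state the closed form of the proposed activation. As a minimal sketch of the properties it names, the function f(x) = x / (1 + x²) below is a stand-in chosen only for illustration: it is non-polynomial, bounded (in [-0.5, 0.5]), symmetric (odd), and non-sigmoidal (not monotone), unlike the logistic and tanh baselines it is compared against. The names `logistic` and `illustrative_activation` are assumptions, not the paper's notation.

```python
import numpy as np

def logistic(x):
    """Log-sigmoid baseline: monotone, bounded in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def illustrative_activation(x):
    """Stand-in activation, NOT the paper's proposed function.
    f(x) = x / (1 + x^2) is non-polynomial, bounded in [-0.5, 0.5],
    odd (f(-x) = -f(x)), and non-monotone, hence non-sigmoidal."""
    return x / (1.0 + x * x)

# Compare the three activations on a sample grid.
x = np.linspace(-10.0, 10.0, 5)
print("tanh:       ", np.tanh(x))
print("logistic:   ", logistic(x))
print("illustrative:", illustrative_activation(x))
```

In a feed-forward network, such a function would replace tanh or the log-sigmoid at the hidden-layer nodes; the universal approximation results cited in the abstract only require the non-linearity to be non-polynomial, so boundedness and symmetry are design choices rather than theoretical requirements.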
