
A Study on Different Functionalities and Performances among Different Activation Functions across Different ANNs for Image Classification
Author(s) -
Xia Zhang,
Di Chen,
Qi Wu,
Zhan Zhi-Ming
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1732/1/012026
Subject(s) - activation function, sigmoid function, image (mathematics), computer science, artificial intelligence, artificial neural network, contextual image classification, pattern recognition (psychology), machine learning
Activation functions are essential in artificial neural networks (ANNs), since they introduce the non-linearity that has been proven necessary for deep learning. ReLU is currently one of the most widely used activation functions; however, several competitors (e.g. LReLU and Swish) have since been proposed or 'discovered'. In this paper, we perform a detailed comparison of five activation functions over two image classification datasets. We found that overall accuracy ranked, from best to worst, as GELU, ReLU, Swish, SELU, and Sigmoid. This observation could inform the design of new state-of-the-art activation functions and thereby improve future image classification.
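For reference, the five activation functions compared here can be written out explicitly. The following is a minimal NumPy sketch assuming the standard textbook definitions (including the commonly cited SELU constants and the tanh approximation of GELU); it is not taken from the paper's own implementation.

    import numpy as np

    def sigmoid(x):
        # Logistic sigmoid: squashes inputs into (0, 1).
        return 1.0 / (1.0 + np.exp(-x))

    def relu(x):
        # Rectified Linear Unit: identity for positive inputs, zero otherwise.
        return np.maximum(0.0, x)

    def swish(x, beta=1.0):
        # Swish: x * sigmoid(beta * x); smooth and non-monotonic.
        return x * sigmoid(beta * x)

    def gelu(x):
        # GELU, tanh approximation of x * Phi(x), where Phi is the
        # standard normal CDF.
        return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

    def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
        # SELU: scaled exponential linear unit with the usual
        # self-normalizing constants.
        return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))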