
Improving Convolutional Neural Network Expression via Difference Exponentially Linear Units
Author(s) -
Zhiying Hu,
Hua Huang,
Qinglin Ran,
Yuan Mingyang
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1651/1/012163
Subject(s) - mnist database , activation function , convolutional neural network , hyperbolic function , computer science , expression (computer science) , function (biology) , nonlinear system , exponential growth , artificial neural network , algorithm , artificial intelligence , pattern recognition (psychology) , speech recognition , mathematics , physics , mathematical analysis , programming language , quantum mechanics , evolutionary biology , biology
Convolutional Neural Networks (CNNs) have been applied to a wide range of tasks with great success. Adding an activation function is an important way to introduce nonlinearity into a convolutional neural network. Most commonly used activation functions apply some form of negative feedback to negative inputs; recently, however, researchers have proposed positive-feedback methods for negative inputs, such as Concatenated Rectified Linear Units (CReLU) and the Linearly Scaled Hyperbolic Tangent (LiSHT), and achieved better performance. To explore this idea further, we propose a new activation function called the Difference Exponentially Linear Unit (DELU). The proposed DELU activation function can selectively provide positive or negative feedback for different values of negative inputs. Our experimental results on commonly used datasets such as Fashion-MNIST, CIFAR-10, and ImageNet show that DELU outperforms six other activation functions: ReLU, Leaky ReLU, ELU, SELU, Swish, and SERLU.
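The abstract names several activation functions but does not give DELU's closed form. The NumPy sketch below is a non-authoritative illustration only: it implements the standard ReLU, ELU, and LiSHT definitions, plus a hypothetical difference-of-exponentials unit (delu_like_sketch, with made-up parameters a and b) that merely shows how a single function could return positive feedback for some negative inputs and negative feedback for others; it is not the authors' DELU definition.

import numpy as np

# Baseline activations named in the abstract (standard definitions).
def relu(x):
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def lisht(x):
    # LiSHT: x * tanh(x); negative inputs yield non-negative outputs.
    return x * np.tanh(x)

# Hypothetical sketch (assumption, not the paper's DELU): a difference of two
# exponentials that gives positive feedback for small-magnitude negative inputs
# and negative feedback for large-magnitude ones. With a = 1.5, b = 1.0 the
# negative branch crosses zero at x = ln((a - b) / b), roughly -0.69.
def delu_like_sketch(x, a=1.5, b=1.0):
    neg = a * (np.exp(x) - 1.0) - b * (np.exp(2.0 * x) - 1.0)
    return np.where(x >= 0, x, neg)

if __name__ == "__main__":
    xs = np.array([-3.0, -1.0, -0.3, 0.0, 2.0])
    print(delu_like_sketch(xs))  # mixed-sign outputs for the negative inputs

Running the example prints a negative output for x = -3.0 and -1.0 but a small positive output for x = -0.3, which is one way to realize the "positive or negative feedback for different negative inputs" behavior the abstract describes.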