Open Access
Low‐complexity neuron for fixed‐point artificial neural networks with ReLU activation function in energy‐constrained wireless applications
Author(s) - Chin WenLong, Zhang Qinyu, Jiang Tao
Publication year - 2021
Publication title - IET Communications
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.355
H-Index - 62
eISSN - 1751-8636
pISSN - 1751-8628
DOI - 10.1049/cmu2.12129
Subject(s) - fixed point , activation function , computer science , artificial neural network , wireless , energy (signal processing) , function (biology) , neuron , neuroscience , artificial intelligence , mathematics , telecommunications , biology , mathematical analysis , statistics , evolutionary biology
This work introduces an efficient neuron design for fixed-point artificial neural networks (ANNs) with the rectified linear unit (ReLU) activation function, targeting energy-constrained wireless applications. Fixed-point binary arithmetic and the ReLU activation function are used in most application-specific integrated circuit designs and ANNs, respectively. Because ANNs involve computation-intensive tasks, their computational burden is very heavy, and many practitioners and researchers are therefore seeking ways to reduce the implementation complexity of ANNs, particularly for battery-powered wireless applications. To this end, a low-complexity neuron is proposed that predicts the sign bit of the input to the non-linear activation function, ReLU, by exploiting the saturation characteristic of the activation function. According to simulation results on random data, the proposed technique reduces the computation overhead of a neuron by 29.6% compared with a conventional neuron using a word length of 8 bits, without noticeably increasing the prediction error. A comparison with the popular 16-bit fixed-point format of the convolutional network AlexNet shows that computation can likewise be reduced by 48.58%.
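
The abstract does not spell out how the sign of the pre-activation is predicted, so the Python sketch below is only an illustration of the general idea under assumed details: the neuron quantises weights and inputs to 8-bit fixed point, forms a coarse partial sum from the most-significant bits of the operands, and skips the full multiply-accumulate whenever that coarse sum is negative, because ReLU would output zero in that case anyway. The function name, the Q4.4 format, and the MSB-truncation heuristic are hypothetical choices for illustration, not the authors' actual circuit.

    import numpy as np

    def relu_neuron_sign_predict(weights, inputs, frac_bits=4, msb_bits=4):
        # Hypothetical sketch of a fixed-point ReLU neuron with early termination.
        # Quantise to an assumed 8-bit Q4.4 fixed-point format.
        w_fx = np.round(np.asarray(weights, dtype=float) * (1 << frac_bits)).astype(np.int32)
        x_fx = np.round(np.asarray(inputs, dtype=float) * (1 << frac_bits)).astype(np.int32)

        # Cheap sign prediction: accumulate products of MSB-truncated operands only.
        shift = 8 - msb_bits
        coarse_sum = int(np.sum((w_fx >> shift) * (x_fx >> shift)))
        if coarse_sum < 0:
            # Predicted negative pre-activation: ReLU output is 0, so the
            # full-precision multiply-accumulate is skipped entirely.
            return 0.0

        # Otherwise perform the full fixed-point MAC and apply ReLU.
        acc = int(np.sum(w_fx * x_fx))  # accumulator in Q8.8
        return max(acc, 0) / float(1 << (2 * frac_bits))

    # Example usage on random data, loosely mirroring the paper's simulation setting.
    rng = np.random.default_rng(0)
    w = rng.uniform(-1, 1, size=64)
    x = rng.uniform(-1, 1, size=64)
    print(relu_neuron_sign_predict(w, x))

The saving comes from the branch that returns early: when the coarse sum predicts a negative pre-activation, the full-word multiplications are never performed, which is the kind of computation reduction the paper quantifies for 8-bit and 16-bit word lengths.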
