ErfReLU: Adaptive Activation Function for Deep Neural Network

06/02/2023
by Ashish Rajanand, et al.

Recent research has shown that the activation function (AF) chosen to introduce non-linearity into a network's output can strongly affect how well deep learning models perform. There is a growing need for activation functions that adapt as the network learns. Researchers have therefore begun developing activation functions that can be trained during the learning process, known as trainable or adaptive activation functions (AAFs). Research on AAFs that improve results is still in its early stages. In this paper, a novel activation function, 'ErfReLU', is developed that combines ReLU with the Gaussian error function (erf), exploiting the strengths of both. State-of-the-art activation functions such as Sigmoid, ReLU, and Tanh, along with their properties, are briefly reviewed. Adaptive activation functions including Tanhsoft1, Tanhsoft2, Tanhsoft3, TanhLU, SAAF, ErfAct, Pserf, Smish, and Serf are also described. Finally, a performance analysis of these nine trainable activation functions and the proposed ErfReLU is presented by applying them in MobileNet, VGG16, and ResNet models on the CIFAR-10, MNIST, and FMNIST benchmark datasets.
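To make the idea concrete, below is a minimal PyTorch sketch of a trainable activation that mixes a ReLU branch with a scaled erf term. The abstract does not give ErfReLU's exact formulation, so the combination rule, the mixing coefficient alpha, and its initialization are illustrative assumptions, not the authors' published definition.

```python
import torch
import torch.nn as nn

class ErfReLU(nn.Module):
    """Sketch of a trainable erf + ReLU activation.

    Assumption: the output is ReLU(x) plus a learnable multiple of
    erf(x). The paper's actual parameterization may differ.
    """

    def __init__(self, alpha: float = 0.1):
        super().__init__()
        # Trainable mixing coefficient (assumed); learned by
        # backpropagation alongside the network weights.
        self.alpha = nn.Parameter(torch.tensor(alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ReLU branch keeps the identity for positive inputs; the erf
        # term adds a smooth, bounded, trainable component.
        return torch.relu(x) + self.alpha * torch.erf(x)
```

A module of this shape could drop in wherever nn.ReLU appears in MobileNet, VGG16, or ResNet, which is how the paper's benchmark comparison is set up.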


