STL: A Signed and Truncated Logarithm Activation Function for Neural Networks

07/31/2023
by Yuanhao Gong, et al.

Activation functions play an essential role in neural networks: they provide the non-linearity, so their properties directly affect a network's accuracy and runtime performance. In this paper, we present a novel signed and truncated logarithm function as an activation function. The proposed activation function has significantly better mathematical properties than common alternatives: it is an odd function, monotone, and differentiable, with an unbounded value range and a continuous, nonzero gradient. These properties make it an excellent choice as an activation function. We compare it with other well-known activation functions in several well-known neural networks, and the results confirm that it achieves state-of-the-art performance. The suggested activation function can be applied in a wide range of neural networks wherever activation functions are needed.
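The abstract does not state the exact formula of STL. As a minimal sketch only, assuming a signed-logarithm form such as f(x) = sign(x) · ln(1 + |x|) (a hypothetical instance that satisfies the listed properties: odd, monotone, differentiable, unbounded, with a continuous nonzero gradient — not necessarily the paper's definition):

```python
import math

def stl(x: float) -> float:
    """Illustrative signed-logarithm activation (assumed form,
    not necessarily the paper's exact STL definition):
        f(x) = sign(x) * ln(1 + |x|)
    Odd, monotone, differentiable, and unbounded in value."""
    return math.copysign(math.log1p(abs(x)), x)

def stl_grad(x: float) -> float:
    """Derivative of the assumed form above:
        f'(x) = 1 / (1 + |x|)
    Continuous everywhere and strictly positive, so the
    gradient never vanishes (unlike ReLU for x < 0)."""
    return 1.0 / (1.0 + abs(x))
```

Under this assumed form, the gradient equals 1 at the origin and decays smoothly toward zero without ever reaching it, which is consistent with the "continuous nonzero gradient" property the abstract highlights.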

