Activation functions are not needed: the ratio net

05/14/2020
by Chi-Chun Zhou, et al.

The function approximator that maps features to labels is an important component of a deep neural network for classification tasks. To overcome nonlinearity, which is the main difficulty in designing the function approximator, one usually uses methods based on nonlinear activation functions or nonlinear kernel functions, which yield classical networks such as the feed-forward neural network (MLP) and the radial basis function network (RBF). Although classical networks such as the MLP are robust in most classification tasks, they are not the most efficient: they use a large number of parameters and take a long time to train. Additionally, the choice of activation function has a non-negligible influence on the effectiveness and efficiency of the network. In this paper, we propose a new network that efficiently finds the function mapping features to labels. Instead of using a nonlinear activation function, the proposed network uses a fractional form to overcome nonlinearity; for convenience, we therefore name it the ratio net. We compare the effectiveness and efficiency of the ratio net with classical networks such as the MLP and the RBF on classification tasks on the MNIST database of handwritten digits and the IMDb dataset, a binary sentiment analysis dataset. The results show that the ratio net outperforms both the MLP and the RBF.
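The abstract describes the ratio net only at a high level, so the following is a minimal PyTorch sketch of one plausible reading: the activation function is replaced by an element-wise quotient of two learned affine maps. The class name RatioLayer, the eps stabilizer, and the layer sizes are illustrative assumptions for an MNIST-style classifier, not the authors' implementation.

# Minimal sketch of a "ratio" layer, assuming the nonlinearity comes from a
# quotient of two affine transformations rather than an activation function.
# This is an illustrative reconstruction of the idea, not the paper's code.
import torch
import torch.nn as nn

class RatioLayer(nn.Module):
    """Computes numerator(x) / (|denominator(x)| + eps) element-wise."""
    def __init__(self, in_features, out_features, eps=1e-3):
        super().__init__()
        self.numerator = nn.Linear(in_features, out_features)
        self.denominator = nn.Linear(in_features, out_features)
        self.eps = eps  # keeps the denominator bounded away from zero

    def forward(self, x):
        num = self.numerator(x)
        den = self.denominator(x)
        # The quotient supplies the nonlinearity; eps guards against division by zero.
        return num / (den.abs() + self.eps)

# Hypothetical classifier for MNIST-sized inputs (784 features, 10 classes).
model = nn.Sequential(
    RatioLayer(784, 128),
    nn.Linear(128, 10),
)

The design choice illustrated here is that no activation function appears anywhere in the model: all nonlinearity is carried by the ratio of the two affine maps, which is the property the paper's title emphasizes.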


