Training Feedforward Neural Networks with Standard Logistic Activations is Feasible

10/03/2017
by Emanuele Sansone, et al.

Training feedforward neural networks with standard logistic activations is considered difficult because of the intrinsic properties of these sigmoidal functions. This work aims at showing that such networks can be trained to achieve generalization performance comparable to that of networks based on hyperbolic tangent activations. The solution consists in applying a set of conditions to parameter initialization, derived from the study of the properties of a single neuron from an information-theoretic perspective. The proposed initialization is validated through an extensive experimental analysis.
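The abstract does not reproduce the derived initialization conditions themselves. As a hedged illustration only, the sketch below shows the close relationship between the logistic and tanh activations (tanh is a scaled, shifted logistic) and one common way to compensate for the logistic's non-zero-centered output at initialization via the bias term; this compensation is an assumption for illustration, not the paper's exact scheme.

```python
import numpy as np

def logistic(x):
    """Standard logistic (sigmoid) activation."""
    return 1.0 / (1.0 + np.exp(-x))

# tanh is a scaled/shifted logistic: tanh(x) = 2*logistic(2x) - 1,
# so the two differ only in output range ([0,1] vs [-1,1]) and gain.
x = np.linspace(-4.0, 4.0, 9)
assert np.allclose(np.tanh(x), 2.0 * logistic(2.0 * x) - 1.0)

# Logistic outputs are not zero-centered (they sit near 0.5), which
# biases downstream pre-activations. A hypothetical compensation
# (NOT the paper's derived conditions): cancel the 0.5 offset
# through the bias at initialization.
rng = np.random.default_rng(0)
fan_in = 256
W = rng.normal(0.0, 1.0 / np.sqrt(fan_in), size=fan_in)
h = logistic(rng.normal(size=fan_in))  # previous-layer logistic outputs
b = -0.5 * W.sum()                     # cancels the constant 0.5 offset
z = W @ h + b                          # equals W @ (h - 0.5): centered
print(float(z))
```

With the bias set this way the pre-activation behaves as if the layer received zero-centered inputs, which is one reason tanh networks are usually considered easier to train; the paper's contribution is a principled set of such initialization conditions for the logistic case.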

