Effect of shapes of activation functions on predictability in the echo state network

05/22/2019
by Hanten Chang, et al.

We investigate the time-series prediction accuracy of echo state networks with respect to several kinds of activation functions. We found that some activation functions with an appropriate degree of nonlinearity achieve higher performance than the conventional sigmoid function.
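The setting described in the abstract can be sketched as follows: a fixed random reservoir driven by the input series, a trainable linear readout, and a swappable reservoir nonlinearity whose shape is the object of comparison. This is a minimal illustrative sketch, not the paper's implementation; all hyperparameters (reservoir size, spectral radius, ridge penalty) and the toy sine-wave task are assumptions.

```python
import numpy as np

def run_esn(series, activation=np.tanh, n_res=200, rho=0.9,
            washout=100, ridge=1e-6, seed=0):
    """One-step-ahead prediction with a minimal echo state network.

    `activation` is the reservoir nonlinearity whose shape is varied
    (e.g. np.tanh vs. a logistic sigmoid). Returns the training RMSE.
    Hyperparameter values here are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, size=n_res)        # input weights (fixed)
    W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))  # reservoir weights (fixed)
    # Rescale to the target spectral radius (a common echo-state-property heuristic).
    W *= rho / max(abs(np.linalg.eigvals(W)))

    # Drive the reservoir with the input series and collect its states.
    states = np.zeros((len(series) - 1, n_res))
    x = np.zeros(n_res)
    for t in range(len(series) - 1):
        x = activation(W @ x + W_in * series[t])
        states[t] = x

    # Ridge-regression readout: predict series[t+1] from the state at time t,
    # discarding an initial washout period of transient states.
    S = states[washout:]
    y = series[washout + 1:]
    W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)
    preds = S @ W_out
    return float(np.sqrt(np.mean((preds - y) ** 2)))

# Compare two activation shapes on a toy sine-wave task.
series = np.sin(0.1 * np.arange(2000))
for name, f in [("tanh", np.tanh),
                ("sigmoid", lambda z: 1.0 / (1.0 + np.exp(-z)))]:
    print(name, run_esn(series, activation=f))
```

Only the readout `W_out` is trained; the activation function enters solely through the reservoir update, which is what makes the ESN a convenient testbed for comparing nonlinearity shapes.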


Related research

06/17/2021
Orthogonal-Padé Activation Functions: Trainable Activation functions for smooth and faster convergence in deep networks
We have proposed orthogonal-Padé activation functions, which are trainab...

08/29/2022
Normalized Activation Function: Toward Better Convergence
Activation functions are essential for neural networks to introduce non-...

03/21/2017
Evolving Parsimonious Networks by Mixing Activation Functions
Neuroevolution methods evolve the weights of a neural network, and in so...

07/29/2021
Optimization of Weights and Activation Functions of Neural Networks Applied to Time Series Forecasting (original title in Portuguese: Otimizacao de pesos e funcoes de ativacao de redes neurais aplicadas na previsao de series temporais)
Neural Networks have been applied for time series prediction with good e...

10/17/2020
Squashing activation functions in benchmark tests: towards eXplainable Artificial Intelligence using continuous-valued logic
Over the past few years, deep neural networks have shown excellent resul...

08/18/2022
Lifted Bregman Training of Neural Networks
We introduce a novel mathematical formulation for the training of feed-f...

11/21/2020
Central and Non-central Limit Theorems arising from the Scattering Transform and its Neural Activation Generalization
Motivated by analyzing complicated and non-stationary time series, we st...
