Feedforward Neural Networks with Diffused Nonlinear Weight Functions

10/27/2003
by Artur Rataj, et al.

This paper presents feedforward neural networks whose nonlinear weight functions are based on look-up tables smoothed by a regularization called diffusion. The idea behind this type of network is the hypothesis that a greater number of adaptive parameters per weight function may reduce the total number of weight functions needed to solve a given problem. If the computational cost of propagating through a single such weight function is kept low, the introduced networks may therefore be relatively fast. A number of tests are performed, showing that in some cases the presented networks indeed perform better than classic neural networks and several other learning machines.
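The full paper is not reproduced here, but as a rough illustration of the abstract's idea, the sketch below models a single nonlinear weight function as a piecewise-linear look-up table with a neighbour-averaging smoothing step standing in for the diffusion regularization. The class name, the bin count, the interpolation scheme, and the smoothing rule are all hypothetical choices for illustration, not the authors' exact formulation.

```python
import numpy as np

class LookupWeightFunction:
    """Hypothetical sketch of one nonlinear weight function: a piecewise-linear
    look-up table mapping a scalar input to a scalar contribution."""

    def __init__(self, n_bins=16, x_min=-1.0, x_max=1.0, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        self.x_min, self.x_max = x_min, x_max
        # The table entries are the adaptive parameters of this weight function.
        self.table = rng.normal(scale=0.1, size=n_bins)

    def __call__(self, x):
        # Map the input into table coordinates and interpolate linearly,
        # so a forward pass through one weight function stays cheap.
        pos = (np.clip(x, self.x_min, self.x_max) - self.x_min) \
              / (self.x_max - self.x_min) * (len(self.table) - 1)
        lo = np.floor(pos).astype(int)
        hi = np.minimum(lo + 1, len(self.table) - 1)
        frac = pos - lo
        return (1.0 - frac) * self.table[lo] + frac * self.table[hi]

    def diffuse(self, rate=0.25):
        # Stand-in for the diffusion regularization: pull each entry toward
        # the mean of its neighbours, discouraging a jagged, overfitted table.
        t = self.table
        neighbour_mean = np.empty_like(t)
        neighbour_mean[1:-1] = 0.5 * (t[:-2] + t[2:])
        neighbour_mean[0], neighbour_mean[-1] = t[1], t[-2]
        self.table = (1.0 - rate) * t + rate * neighbour_mean

# Example use: evaluate and then smooth one weight function.
wf = LookupWeightFunction(rng=np.random.default_rng(0))
print(wf(np.array([-0.5, 0.0, 0.5])))
wf.diffuse()
```

In an actual network of this kind, each connection would carry such a table in place of a single scalar weight, with the smoothing applied during training; the details of how the tables are fitted are given in the paper itself.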
