Fast Feedforward Networks

08/28/2023
by Peter Belcak, et al.

We break the linear link between a layer's size and its inference cost by introducing the fast feedforward (FFF) architecture, a log-time alternative to feedforward networks. We demonstrate that FFFs are up to 220x faster than feedforward networks, up to 6x faster than mixture-of-experts networks, and exhibit better training properties than mixtures of experts thanks to noiseless conditional execution. Pushing FFFs to the limit, we show that they can use as little as 1% of layer neurons for inference while preserving 94.2% of predictive performance.
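To illustrate how conditional execution can make inference cost logarithmic in layer width, here is a minimal, inference-only sketch of the idea: a binary tree of single routing neurons sends each input to one of several small leaf feedforward blocks, so only O(depth) routing neurons plus one leaf are evaluated per input. The class name, parameter shapes, and hard thresholding below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class FFFSketch:
    """Sketch of a fast-feedforward-style layer (hypothetical shapes).

    A depth-d binary tree of routing neurons selects one of 2**d small
    leaf feedforward blocks, so a forward pass evaluates d routing
    neurons and a single leaf instead of the full layer width.
    """

    def __init__(self, in_dim, leaf_width, out_dim, depth):
        self.depth = depth
        n_nodes = 2 ** depth - 1             # internal routing neurons (heap layout)
        n_leaves = 2 ** depth
        self.node_w = rng.normal(size=(n_nodes, in_dim))
        self.w1 = rng.normal(size=(n_leaves, leaf_width, in_dim))
        self.w2 = rng.normal(size=(n_leaves, out_dim, leaf_width))

    def forward(self, x):
        node = 0
        for _ in range(self.depth):          # log-time hard routing at inference
            go_right = self.node_w[node] @ x > 0
            node = 2 * node + (2 if go_right else 1)
        leaf = node - (2 ** self.depth - 1)  # map heap index to leaf index
        h = np.maximum(self.w1[leaf] @ x, 0.0)   # small leaf feedforward block
        return self.w2[leaf] @ h

fff = FFFSketch(in_dim=16, leaf_width=4, out_dim=8, depth=3)
y = fff.forward(rng.normal(size=16))
```

Note that the abstract's "noiseless conditional execution" refers to training-time behavior; during training the branch decisions would be soft and differentiable, with the hard routing above used only at inference.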
