Improving Performance in Neural Networks by Dendrites-Activated Connections

01/03/2023
by Carlo Metta, et al.

Computational units in artificial neural networks compute a linear combination of their inputs and then apply a nonlinear filter, often a ReLU shifted by some bias. If the inputs come themselves from other units, they were already filtered with their own biases. In a layer, multiple units share the same inputs, yet each input was filtered with a single bias, so output values are based on shared input biases rather than individually optimal ones. To mitigate this issue, we introduce DAC, a new computational unit based on preactivation and multiple biases, where input signals undergo independent nonlinear filtering before the linear combination. We provide a Keras implementation and report its computational efficiency. We test DAC convolutions in ResNet architectures on CIFAR-10, CIFAR-100, Imagenette, and Imagewoof, and achieve performance improvements of up to 1.73%. We present examples where DAC is more efficient than its standard counterpart as a function approximator, and we prove a universal representation theorem.
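
The abstract contrasts the standard unit, y_j = φ(b_j + Σ_i w_ij x_i), with a unit that filters each input independently before combining, roughly y_j = Σ_i w_ij φ(x_i + b_ij). The paper's own Keras implementation is not reproduced here; the following is a minimal sketch of that idea as a custom dense layer, assuming per-(input, unit) biases and a ReLU filter. The layer name `DACDense` is hypothetical, not from the paper.

```python
import tensorflow as tf


class DACDense(tf.keras.layers.Layer):
    """Sketch of a dendrites-activated dense unit (hypothetical name).

    Each output unit applies its own bias to every input and filters it
    with a ReLU *before* the linear combination:
        y_j = sum_i w_ij * relu(x_i + b_ij)
    """

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        n_in = int(input_shape[-1])
        # One weight and one bias per (input, output-unit) pair.
        self.w = self.add_weight(
            shape=(n_in, self.units), initializer="glorot_uniform", name="w")
        self.b = self.add_weight(
            shape=(n_in, self.units), initializer="zeros", name="b")

    def call(self, x):
        # Preactivation: each input gets a unit-specific bias, then ReLU.
        # x: (batch, n_in) -> pre: (batch, n_in, units) by broadcasting.
        pre = tf.nn.relu(x[..., :, None] + self.b)
        # Linear combination happens *after* the nonlinear filtering.
        return tf.reduce_sum(self.w * pre, axis=-2)  # (batch, units)
```

For example, `DACDense(16)(tf.keras.Input(shape=(8,)))` yields a `(None, 16)` tensor. Note the cost of the multiple biases: the bias count grows from `units` to `n_in * units`, which is the trade-off the paper's efficiency measurements address.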
