Competitive plasticity to reduce the energetic costs of learning

by Mark CW van Rossum, et al.

The brain is constrained not only by the energy needed to fuel computation, but also by the energy needed to form memories. Experiments have shown that learning even simple conditioning tasks carries a significant metabolic cost. Yet learning a task like MNIST to 95% accuracy requires at least 10^8 synaptic updates. Therefore the brain has likely evolved to learn using as little energy as possible. We explored the energy required for learning in feedforward neural networks. Based on a parsimonious energy model, we propose two plasticity-restricting algorithms that save energy: 1) only modify synapses with large updates, and 2) restrict plasticity to subsets of synapses that form a path through the network. Combining these two methods leads to substantial energy savings while incurring only a small increase in learning time. In biology, networks are often much larger than the task requires; particularly in that case, large savings can be achieved. Thus, competitively restricting plasticity helps to save the metabolic energy associated with synaptic plasticity. The results might lead to a better understanding of biological plasticity and a better match between artificial and biological learning. Moreover, the algorithms might also benefit hardware, because in electronics memory storage is energetically costly as well.
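The two plasticity-restricting rules in the abstract can both be viewed as masks applied to a gradient update. The sketch below, a minimal illustration rather than the paper's actual method, assumes a two-layer feedforward network and invented names (`largest_update_mask`, `path_masks`, `masked_step`) and parameterizations (a top-fraction threshold for rule 1, randomly drawn input-hidden-output paths for rule 2):

```python
import numpy as np

def largest_update_mask(grads, frac=0.1):
    """Rule 1 (sketch): permit only the top `frac` fraction of updates by magnitude."""
    k = max(1, int(frac * grads.size))
    # threshold at the k-th largest absolute gradient
    thresh = np.partition(np.abs(grads).ravel(), -k)[-k]
    return np.abs(grads) >= thresh

def path_masks(n_in, n_hid, n_out, n_paths, rng):
    """Rule 2 (sketch): restrict plasticity to synapses lying on
    input -> hidden -> output paths through a two-layer network."""
    m1 = np.zeros((n_in, n_hid), dtype=bool)   # input-to-hidden weights
    m2 = np.zeros((n_hid, n_out), dtype=bool)  # hidden-to-output weights
    for _ in range(n_paths):
        i = rng.integers(n_in)
        h = rng.integers(n_hid)
        o = rng.integers(n_out)
        m1[i, h] = True   # input -> hidden synapse on the path
        m2[h, o] = True   # hidden -> output synapse on the path
    return m1, m2

def masked_step(weights, grads, mask, lr=0.1):
    """Apply a gradient step only where plasticity is permitted;
    masked-out synapses stay fixed and incur no update cost."""
    return weights - lr * np.where(mask, grads, 0.0)
```

Either mask (or their elementwise AND, combining the two methods) can be passed to `masked_step`, so the number of synaptic updates per step, and hence the update-related energy in this simple accounting, is bounded by the mask size.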


