Federated Learning with Spiking Neural Networks

06/11/2021
by Yeshwanth Venkatesha, et al.

As neural networks gain widespread adoption in resource-constrained embedded devices, there is a growing need for low-power neural systems. Spiking Neural Networks (SNNs) are emerging as an energy-efficient alternative to traditional Artificial Neural Networks (ANNs), which are known to be computationally intensive. From an application perspective, since federated learning involves multiple energy-constrained devices, there is significant scope to leverage the energy efficiency of SNNs. Despite its importance, little attention has been paid to training SNNs on large-scale distributed systems such as federated learning. In this paper, we bring SNNs to a more realistic federated learning scenario. Specifically, we propose a federated learning framework for decentralized and privacy-preserving training of SNNs. To validate the proposed framework, we experimentally evaluate the advantages of SNNs on various aspects of federated learning with the CIFAR10 and CIFAR100 benchmarks. We observe that SNNs outperform ANNs in overall accuracy by over 15% when the data is distributed across a large number of clients in the federation, while providing up to 5.3x energy efficiency. In addition to efficiency, we analyze the sensitivity of the proposed federated SNN framework to data distribution among the clients, stragglers, and gradient noise, and perform a comprehensive comparison with ANNs.
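The abstract does not spell out the aggregation step, but federated training frameworks of this kind typically rest on weighted parameter averaging across clients (FedAvg-style). Below is a minimal sketch of that server-side step; `fedavg` is a hypothetical helper written for illustration, not code from the paper, and the per-layer NumPy arrays stand in for either ANN or SNN weights.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Server-side FedAvg sketch: average each layer's parameters across
    clients, weighted by the number of local training samples per client.
    client_weights: list (one per client) of lists of np.ndarray (one per layer).
    client_sizes:   list of local dataset sizes, same order as client_weights.
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    averaged = []
    for layer in range(num_layers):
        # Accumulate the size-weighted contribution of every client.
        acc = np.zeros_like(client_weights[0][layer], dtype=np.float64)
        for weights, n in zip(client_weights, client_sizes):
            acc += (n / total) * weights[layer]
        averaged.append(acc)
    return averaged

# Toy round: two clients sharing a one-layer "model".
clients = [[np.array([1.0, 2.0])], [np.array([3.0, 4.0])]]
sizes = [1, 3]
print(fedavg(clients, sizes)[0])  # [2.5 3.5]
```

In a spiking setting the same aggregation applies to the synaptic weights; the client-side difference is that local training uses spike-based updates (e.g. surrogate-gradient backpropagation) rather than standard backpropagation.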


Related research

- Communication Trade-offs in Federated Learning of Spiking Neural Networks (02/27/2023): Spiking Neural Networks (SNNs) are biologically inspired alternatives to...
- Energy Prediction using Federated Learning (01/22/2023): In this work, we demonstrate the viability of using federated learning t...
- Federated Neuromorphic Learning of Spiking Neural Networks for Low-Power Edge Intelligence (10/21/2019): Spiking Neural Networks (SNNs) offer a promising alternative to conventi...
- An Energy Optimized Specializing DAG Federated Learning based on Event Triggered Communication (09/26/2022): Specializing Directed Acyclic Graph Federated Learning (SDAGFL) is a new ...
- HyperTune: Dynamic Hyperparameter Tuning For Efficient Distribution of DNN Training Over Heterogeneous Systems (07/16/2020): Distributed training is a novel approach to accelerate Deep Neural Netwo...
- Intrinsic Gradient Compression for Federated Learning (12/05/2021): Federated learning is a rapidly-growing area of research which enables a...
- DEAL: Decremental Energy-Aware Learning in a Federated System (02/05/2021): Federated learning struggles with their heavy energy footprint on batter...
