Open Access
Federal SNN Distillation: A Low-Communication-Cost Federated Learning Framework for Spiking Neural Networks
Author(s) -
Zhetong Liu,
Qiugang Zhan,
Xiurui Xie,
Bingchao Wang,
Guisong Liu
Publication year - 2022
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2216/1/012078
Subject(s) - mnist database , computer science , spiking neural network , lossless compression , artificial neural network , artificial intelligence , machine learning , computer engineering , data compression
In recent years, research on federated spiking neural network (SNN) frameworks has attracted increasing attention in the area of on-chip learning for embedded devices, owing to their low power consumption and privacy protection. Most existing federated SNN frameworks build on the classical federated learning framework, Federated Averaging (FedAvg), in which communication is performed by exchanging network parameters or gradients. Although these frameworks employ a range of techniques to reduce the communication cost, that cost still grows with the scale of the backbone network. To address this problem, we propose a new federated SNN framework, Federal SNN Distillation (FSD), whose communication cost is independent of the network scale. Following the idea of knowledge distillation, FSD replaces network parameters or gradients with the output spikes of the SNN, which greatly reduces communication while preserving accuracy. In addition, we propose a lossless compression algorithm to further compress the binary output spikes of the SNN. We compare FSD with existing FedAvg frameworks on the MNIST, Fashion-MNIST and CIFAR-10 datasets. The experimental results demonstrate that FSD reduces communication by 1-2 orders of magnitude while reaching the same accuracy.
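The abstract does not specify the lossless compression algorithm, but the key observation it relies on is that SNN outputs are binary, so they compress trivially compared with float-valued parameters. A minimal sketch of one plausible approach, bit-packing with NumPy (all tensor sizes and function names below are illustrative, not the paper's implementation):

```python
import numpy as np

def pack_spikes(spikes: np.ndarray) -> tuple[bytes, tuple[int, ...]]:
    # Flatten the binary spike tensor and pack 8 spikes into each byte.
    flat = np.asarray(spikes, dtype=np.uint8).ravel()
    return np.packbits(flat).tobytes(), spikes.shape

def unpack_spikes(packed: bytes, shape: tuple[int, ...]) -> np.ndarray:
    # Unpack bytes back to bits and drop the padding that packbits appends.
    n = int(np.prod(shape))
    flat = np.unpackbits(np.frombuffer(packed, dtype=np.uint8))[:n]
    return flat.reshape(shape)

# Illustrative client payload: output spikes for a batch of 32 samples,
# 10 output neurons, 20 simulation time steps, ~20% firing rate.
rng = np.random.default_rng(0)
spikes = (rng.random((32, 10, 20)) < 0.2).astype(np.uint8)

packed, shape = pack_spikes(spikes)
restored = unpack_spikes(packed, shape)

assert np.array_equal(spikes, restored)       # round trip is lossless
assert len(packed) == -(-spikes.size // 8)    # 8 spikes per transmitted byte
```

Even this naive packing sends 1 bit per spike instead of 32 bits per float parameter, and the payload scales with the output layer and batch size rather than with the backbone network, which is the property the abstract emphasizes.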
