FL-HDC: Hyperdimensional Computing Design for the Application of Federated Learning
Journal
2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems, AICAS 2021
ISBN
9.78167E+12
Date Issued
2021
Author(s)
Abstract
Federated learning (FL) is a privacy-preserving learning framework that collaboratively learns a centralized model across edge devices. Each device trains an independent model on its local dataset and uploads only the model parameters, mitigating privacy concerns. However, most FL work focuses on deep neural networks (DNNs), whose intensive computation hinders the practical realization of FL on resource-limited edge devices. In this paper, we exploit the high energy efficiency of hyperdimensional computing (HDC) to propose a federated learning HDC (FL-HDC). In FL-HDC, we bipolarize model parameters to significantly reduce communication costs, a primary concern in FL. Moreover, we propose a retraining mechanism with adaptive learning rates to compensate for the accuracy degradation caused by bipolarization. Under the FL scenario, our simulation results show the effectiveness of the proposed FL-HDC on two datasets, MNIST and ISOLET. Compared with previous work that transmits complete model parameters to the cloud, FL-HDC reduces communication costs by 23x on ISOLET and 9x on MNIST with comparable accuracy. © 2021 IEEE.
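The bipolarization step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the dimensionality, the integer-valued class hypervector, and the zero-to-+1 tie-breaking rule are all illustrative assumptions. The point is only that sending signs instead of full-precision values cuts the per-parameter upload from 32 bits to 1 bit (the paper's reported 23x and 9x end-to-end savings are lower because retraining rounds add communication):

```python
import numpy as np

# Hypothetical sketch of FL-HDC-style bipolarization (details assumed):
# a client trains an integer-valued class hypervector, then uploads only
# its element-wise signs, shrinking each parameter from 32 bits to 1 bit.

D = 10000  # hypervector dimensionality (illustrative choice)
rng = np.random.default_rng(0)

# Stand-in for a locally trained class hypervector of accumulated counts.
class_hv = rng.integers(-50, 51, size=D)

# Bipolarize: keep only the sign; map zeros to +1 so values stay in {-1, +1}.
bipolar_hv = np.where(class_hv >= 0, 1, -1).astype(np.int8)

# Per-parameter communication cost: 32-bit values vs. 1 bit per dimension.
full_bits = D * 32
bipolar_bits = D * 1
print(full_bits // bipolar_bits)  # upper-bound reduction per upload
```

Because every transmitted value is in {-1, +1}, the cloud can aggregate client models by element-wise summation and re-bipolarize, which is one plausible way such a scheme keeps the global model in the same compact format.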
Subjects
adaptive learning rate; federated learning; hyperdimensional computing; retraining mechanism
SDGs
Other Subjects
Cost reduction; Deep neural networks; Energy efficiency; Green computing; Privacy by design; Adaptive learning rates; Centralized models; Communication cost; High energy efficiency; Independent model; Learning frameworks; Privacy concerns; Privacy preserving; Deep learning
Type
conference paper
