Title: FL-HDC: Hyperdimensional Computing Design for the Application of Federated Learning
Author: An-Yeu (Andy) Wu
Document type: conference paper
Year: 2021
Dates: 2022-05-19; 2022-05-19
ISBN: 9.78167E+12
DOI: 10.1109/AICAS51828.2021.9458526
Scopus EID: 2-s2.0-85113291061
Scopus URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85113291061&doi=10.1109%2fAICAS51828.2021.9458526&partnerID=40&md5=b29034f114bcf3c53b374453d179b318
Handle: https://scholars.lib.ntu.edu.tw/handle/123456789/611192

Abstract: Federated learning (FL) is a privacy-preserving learning framework that collaboratively learns a centralized model across edge devices. Each device trains an independent model on its local dataset and uploads only model parameters, mitigating privacy concerns. However, most FL work focuses on deep neural networks (DNNs), whose intensive computation hinders practical realization of FL on resource-limited edge devices. In this paper, we exploit the high energy efficiency of hyperdimensional computing (HDC) to propose federated learning with HDC (FL-HDC). In FL-HDC, we bipolarize model parameters to significantly reduce communication costs, a primary concern in FL. Moreover, we propose a retraining mechanism with adaptive learning rates to compensate for the accuracy degradation caused by bipolarization. Under the FL scenario, simulation results show the effectiveness of the proposed FL-HDC on two datasets, MNIST and ISOLET. Compared with previous work that transmits complete model parameters to the cloud, FL-HDC reduces communication costs by 23x on ISOLET and 9x on MNIST with comparable accuracy. © 2021 IEEE.

Author keywords: adaptive learning rate; federated learning; hyperdimensional computing; retraining mechanism
SDGs: SDG7
Indexed keywords: Cost reduction; Deep neural networks; Energy efficiency; Green computing; Privacy by design; Adaptive learning rates; Centralized models; Communication cost; High energy efficiency; Independent model; Learning frameworks; Privacy concerns; Privacy preserving; Deep learning
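
The communication-saving step named in the abstract, bipolarizing model parameters so each dimension costs one bit on the uplink instead of a full-precision float, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names (`bipolarize`, `server_aggregate`) and the element-wise majority-vote aggregation rule are assumptions drawn from common HDC practice, since the abstract does not specify the protocol.

```python
import numpy as np

def bipolarize(model):
    """Map each real-valued component to +/-1 by sign, so each
    dimension can be transmitted as a single bit."""
    return np.where(model >= 0, 1, -1).astype(np.int8)

def server_aggregate(bipolar_models):
    """Element-wise majority vote across devices: sum the +/-1
    matrices and re-bipolarize into a global bipolar model.
    (Ties at zero map to +1 here; an arbitrary convention.)"""
    stacked = np.stack(bipolar_models)           # (n_devices, C, D)
    return bipolarize(stacked.sum(axis=0))

# Toy round: 5 devices, 10 classes, D = 10000 (a typical HDC dimensionality)
rng = np.random.default_rng(0)
local_models = [rng.standard_normal((10, 10000)) for _ in range(5)]
uploads = [bipolarize(m) for m in local_models]  # 1 bit/dim vs. 32 bits/dim
global_model = server_aggregate(uploads)
```

Against a baseline that uploads 32-bit floats, the bipolar upload alone gives a 32x reduction per dimension; the 23x/9x figures reported in the abstract reflect the end-to-end protocol on ISOLET and MNIST.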
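
The retraining mechanism with adaptive learning rates is described only at a high level in the abstract. Below is a hypothetical sketch of a common HDC retraining rule (add the query hypervector to the true class, subtract it from the mispredicted class), with the learning rate adapted to the similarity margin of the error; the paper's actual update and schedule may differ.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two hypervectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def retrain_step(class_hvs, query_hv, true_label):
    """One retraining update on a single sample. The adaptive
    learning rate here (similarity gap between the wrong winner
    and the true class) is an assumption for illustration."""
    sims = np.array([cosine(c, query_hv) for c in class_hvs])
    pred = int(np.argmax(sims))
    if pred != true_label:
        lr = max(sims[pred] - sims[true_label], 0.0)  # larger error -> larger step
        class_hvs[true_label] += lr * query_hv
        class_hvs[pred] -= lr * query_hv
    return class_hvs
```

Scaling the step by the error margin makes confident mistakes push the class hypervectors harder, which is one plausible way to recover accuracy lost to bipolarization during subsequent rounds.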