Performance Analysis and Optimization for Federated Learning Applications with PySyft-based Secure Aggregation
Journal
Proceedings - 2020 International Computer Symposium, ICS 2020
Pages
191-196
Date Issued
2020
Author(s)
Abstract
To address privacy concerns, federated learning (FL) has emerged as a promising machine learning technique that enables multiple decentralized clients to train a shared model collaboratively while keeping their private training data local. Although FL reduces the risk of data leakage, it is still possible for attackers to reverse-engineer a trained model and recover information from the original training dataset provided by an FL client. To avoid such risks, secure aggregation (SA) can be used to privately combine the clients' trained models when updating the shared model. However, SA usually introduces performance overhead, as it requires additional computation for encryption operations and even additional communication when secure multi-party computation (SMPC) is used. In this paper, we analyze the performance of FL with SA using PySyft, an open-source framework that includes an FL implementation, and propose an asynchronous FL mechanism to improve overall performance. The results show that performance depends on the computational capabilities of the clients and the characteristics of the communication network. We therefore propose a performance-modeling method that helps system designers break down the execution time and choose suitable trade-offs between privacy, efficiency, and accuracy for a balanced system. © 2020 IEEE.
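The secure aggregation the abstract refers to can be illustrated with additive secret sharing, one common SMPC building block (the paper's own PySyft-based implementation may differ): each client splits its update into random shares that individually reveal nothing, and only the sum of all shares reconstructs the aggregate. A minimal sketch, with an illustrative field modulus and plain integers standing in for quantized model updates:

```python
import random

PRIME = 2**31 - 1  # illustrative field modulus for additive secret sharing


def share(value, n_parties):
    """Split an integer into n additive shares that sum to value mod PRIME.

    Any subset of fewer than n shares is uniformly random and reveals
    nothing about the original value.
    """
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares


def secure_aggregate(client_values):
    """Aggregate client updates without revealing any individual update.

    Each client secret-shares its value across all parties; each party
    sums the shares it received, and only these partial sums are combined.
    """
    n = len(client_values)
    all_shares = [share(v, n) for v in client_values]
    partial_sums = [sum(col) % PRIME for col in zip(*all_shares)]  # per party
    return sum(partial_sums) % PRIME


updates = [12, 7, 30]  # stand-ins for quantized model updates
assert secure_aggregate(updates) == sum(updates) % PRIME
```

The extra share generation and the share exchange between parties are the computation and communication overheads the abstract attributes to SA.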
Subjects
Data Privacy; Federated Learning; Performance Modeling; Secure Aggregation
Other Subjects
Cryptography; Economic and social effects; Personal computing; Personnel training; Privacy by design; Computational capability; Encryption operations; Machine learning techniques; Open source frameworks; Performance analysis and optimizations; Performance Model; Secure aggregations; Secure multi-party computation; Learning systems
Type
conference paper
