https://scholars.lib.ntu.edu.tw/handle/123456789/607442
Title: Distributed deep learning optimized system over the cloud and smart phone devices
Authors: Jiang H.; Starkman J.; Lee Y.-J.; Chen H.; Qian X.; Huang M.-C. (Ming-Chun Huang)
Keywords: deep learning; distributed deep neural networks; edge computing; mobile computing; wearable computers and body area networks; wearable healthcare; data mining; data privacy; learning systems; smartphones; consensus models; distributed data mining; learning techniques; network traffic; optimized system; parameter sharing; personal training; sensitive data
Date issued: 2021
Volume: 20
Issue: 1
Pages: 147-161
Journal: IEEE Transactions on Mobile Computing
Abstract: Deep learning has become a promising focus of data mining research. With deep learning techniques, researchers can discover deep properties and features of events from quantitative mobile sensor data. However, many data sources are geographically separated and subject to strict privacy, security, and regulatory constraints. Once privacy-sensitive data are released, the data sources generally no longer physically possess them and cannot control how their personal data are used. It is therefore necessary to explore a distributed data mining architecture that can conduct consensus learning on demand. Accordingly, we propose a distributed deep learning optimized system consisting of a cloud server and multiple smartphone devices with computation capabilities, where each device serves as a personal mobile data hub that enables mobile computing while preserving data privacy. The proposed system keeps private data locally on the smartphones, shares trained parameters, and builds a global consensus model. The feasibility and usability of the proposed system are evaluated through three experiments and related discussion. The experimental results show that the proposed distributed deep learning system can reproduce the behavior of centralized training. We also measure the cumulative network traffic in different scenarios and show that the partial parameter sharing strategy not only preserves the performance of the trained model but also reduces network traffic.
User data privacy is protected at two levels. First, local private training data need not be shared with anyone, and the user retains full control of their personal training data at all times. Second, only a small fraction of the trained gradients of the local model is selected for sharing, which further reduces the risk of information leakage. © 2002-2012 IEEE.
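The partial gradient sharing scheme described in the abstract can be sketched as follows. This is a minimal illustration under assumed details, not the authors' implementation: each device computes a gradient on its local (private) data, keeps only the largest-magnitude fraction of entries (the top-k selection rule here is an assumption), and the cloud server averages the sparse gradients into the global consensus model. Raw data never leave the devices; only sparse gradients do.

```python
import numpy as np

# Illustrative sketch of partial gradient sharing; the selection rule and
# aggregation below are assumptions, not the paper's actual code.

rng = np.random.default_rng(0)
DIM, DEVICES, SHARE_FRACTION = 100, 5, 0.1  # share only 10% of gradient entries

def top_k_sparsify(grad, frac):
    """Zero out all but the largest-magnitude fraction of gradient entries."""
    k = max(1, int(frac * grad.size))
    mask = np.zeros_like(grad)
    mask[np.argsort(np.abs(grad))[-k:]] = 1.0
    return grad * mask

# Each device holds private data locally; here a random target vector stands
# in for each device's private dataset (local loss: 0.5 * ||w - target||^2).
local_targets = [rng.normal(size=DIM) for _ in range(DEVICES)]
global_model = np.zeros(DIM)

for _ in range(50):  # communication rounds
    shared = []
    for target in local_targets:
        grad = global_model - target  # local gradient; data stay on-device
        shared.append(top_k_sparsify(grad, SHARE_FRACTION))
    # Cloud server averages the sparse gradients into the consensus model.
    global_model -= 0.5 * np.mean(shared, axis=0)
```

Because each device transmits only a tenth of its gradient entries per round, the cumulative network traffic drops roughly in proportion to the sharing fraction, while the consensus model still drifts toward the minimizer of the averaged local losses.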
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85097799072&doi=10.1109%2fTMC.2019.2941492&partnerID=40&md5=9e6431dd16e958729e7067adb72efe4a https://scholars.lib.ntu.edu.tw/handle/123456789/607442
ISSN: 1536-1233
DOI: 10.1109/TMC.2019.2941492
Appears in Collections: Department of Computer Science and Information Engineering
Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.