IP-HDC: Information-Preserved Hyperdimensional Computing for Multi-Task Learning
Journal
IEEE Workshop on Signal Processing Systems, SiPS: Design and Implementation
Journal Volume
2020-October
ISBN
9.78173E+12
Date Issued
2020
Author(s)
Abstract
Brain-inspired Hyperdimensional (HD) computing has shown great success in many real-world applications requiring low-power designs, e.g., edge computing for the Internet of Things. For edge computing, multi-task learning (MTL) is more desirable than single-task learning because it yields more efficient model deployments on resource-constrained devices. As a lightweight algorithm, HD computing is compatible with MTL. However, despite its energy efficiency, the memory overhead of HD computing grows significantly with the number of tasks under the MTL scenario. In this paper, we propose Information-Preserved HD computing (IP-HDC) to make the HD model simultaneously support multiple tasks with negligible memory overhead. Moreover, to mitigate interference between tasks, we introduce 'mask' HD vectors that extract the informative dimensions of each task. Compared with the baseline method, IP-HDC shows a 22.9% accuracy improvement and demonstrates both reliability and scalability by efficiently achieving isolation between tasks in the HD space. © 2020 IEEE.
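The abstract's idea of per-task 'mask' HD vectors can be illustrated with a minimal sketch. This is not the paper's algorithm: the encoder (random projection to bipolar vectors), the mask criterion (keep the dimensions where a task's class prototypes vary most), the dimensionality `D`, and the toy two-class tasks are all assumptions made for illustration only.

```python
import numpy as np

# Illustrative sketch (NOT the IP-HDC algorithm from the paper): each task
# keeps a binary mask over the D hyperdimensions that selects the dimensions
# most informative for that task, so similarity is computed only on them.

rng = np.random.default_rng(0)
D = 2000   # hyperdimensionality (assumed value for this sketch)
F = 10     # input feature dimension of the toy data

def encode(x, proj):
    """Random-projection encoding of a feature vector to a bipolar HD vector."""
    return np.sign(proj @ x)

def train_task(X, y, proj, n_classes):
    """Bundle (sum) the encoded samples of each class into a class prototype."""
    protos = np.zeros((n_classes, D))
    for xi, yi in zip(X, y):
        protos[yi] += encode(xi, proj)
    return protos

def make_mask(protos, keep=0.5):
    """Keep the dimensions where the task's class prototypes differ most."""
    var = protos.var(axis=0)            # per-dimension variance across classes
    k = int(keep * D)
    mask = np.zeros(D, dtype=bool)
    mask[np.argsort(var)[-k:]] = True   # top-k "informative" dimensions
    return mask

def classify(x, protos, mask, proj):
    """Predict by masked dot-product similarity against the class prototypes."""
    h = encode(x, proj) * mask          # restrict similarity to the task's dims
    return int(np.argmax(protos @ h))

def toy_task(shift):
    """Two well-separated Gaussian classes (synthetic stand-in for a task)."""
    X = np.vstack([rng.standard_normal((50, F)) + shift,
                   rng.standard_normal((50, F)) - shift])
    y = np.array([0] * 50 + [1] * 50)
    return X, y

proj = rng.standard_normal((D, F))      # one shared encoder across tasks
accs = []
for shift in (2.0, 3.0):                # two toy tasks sharing the encoder
    X, y = toy_task(shift)
    protos = train_task(X, y, proj, n_classes=2)
    mask = make_mask(protos)
    preds = [classify(x, protos, mask, proj) for x in X]
    accs.append(float(np.mean(np.array(preds) == y)))
```

In this sketch each task stores only a boolean mask of size `D` on top of the shared model, which is the spirit of the negligible per-task memory overhead the abstract describes; the actual mask construction and task-isolation mechanism in IP-HDC are defined in the paper itself.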
Subjects
Brain-inspired computing; Hyperdimensional computing; Memory efficiency; Model compression; Multi-task learning
SDGs
Other Subjects
Edge computing; Electric power supplies to apparatus; Energy efficiency; Green computing; Internet protocols; Learning systems; Linearization; Signal processing; Silicon compounds; Accuracy improvement; Baseline methods; Brain-inspired; Low-power design; Memory overheads; Multiple tasks; Resource-constrained devices; Single-task learning; Multi-task learning
Type
conference paper
