Title: Compacting, Picking and Growing for Unforgetting Continual Learning
Authors: Hung S.C.-Y.; Tu C.-H.; Wu C.-E.; Chen C.-H.; Chan Y.-M.; CHU-SONG CHEN
Keywords: Deep learning; Iterative methods; Continual learning; Effective approaches; Incremental learning; Lifelong learning; Model compression; Model expansion; Sequential tasks; Task training; Learning systems
Date Issued: 2019
Volume: 32
Source Publication: Advances in Neural Information Processing Systems
Abstract: Continual lifelong learning is essential to many applications. In this paper, we propose a simple but effective approach to continual deep learning. Our approach leverages the principles of deep model compression, critical-weight selection, and progressive network expansion. By integrating these steps in an iterative manner, we introduce an incremental learning method that scales with the number of sequential tasks in a continual learning process. Our approach is easy to implement and has several favorable characteristics. First, it avoids forgetting (i.e., it learns new tasks while remembering all previous tasks). Second, it allows model expansion while maintaining model compactness when handling sequential tasks. Moreover, through our compaction and selection/expansion mechanism, we show that the knowledge accumulated from previous tasks helps build a better model for new tasks than training a separate model for each task. Experimental results show that our approach can incrementally learn a deep model that tackles multiple tasks without forgetting, while maintaining model compactness and achieving better performance than individual task training. © 2019 Neural Information Processing Systems Foundation. All rights reserved.
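The abstract describes an iterative loop of compaction (pruning a task's weights to free capacity), picking (recording which weights each task owns so they stay fixed), and growing (widening the model when free capacity runs out). A minimal sketch of that loop, assuming magnitude-based pruning on a single weight matrix; all class and parameter names here are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of the compact/pick/grow cycle from the abstract.
# Magnitude pruning, per-task masks, and the growth step are simplified
# stand-ins for the paper's actual procedure.
import numpy as np

rng = np.random.default_rng(0)

class CPGLayer:
    """One weight matrix managed across sequential tasks."""

    def __init__(self, shape):
        self.w = rng.normal(0.0, 0.1, shape)
        self.frozen = np.zeros(shape, dtype=bool)  # weights owned by old tasks
        self.task_masks = []                       # per-task "picked" masks

    def compact(self, keep_ratio=0.5):
        """Prune the smallest newly trained weights, freeing capacity,
        and freeze the survivors so later tasks cannot disturb them."""
        free = ~self.frozen                        # weights the new task trained
        vals = np.abs(self.w[free])
        if vals.size == 0:
            return
        thresh = np.quantile(vals, 1.0 - keep_ratio)
        pruned = free & (np.abs(self.w) < thresh)
        self.w[pruned] = 0.0                       # released for future tasks
        kept = free & ~pruned
        self.frozen |= kept                        # old-task weights stay fixed
        self.task_masks.append(kept)               # record what this task "picked"

    def grow(self, extra_cols):
        """Expand the layer when the released capacity is insufficient."""
        add = rng.normal(0.0, 0.1, (self.w.shape[0], extra_cols))
        self.w = np.hstack([self.w, add])
        self.frozen = np.hstack(
            [self.frozen, np.zeros(add.shape, dtype=bool)])
        self.task_masks = [
            np.hstack([m, np.zeros(add.shape, dtype=bool)])
            for m in self.task_masks]

layer = CPGLayer((4, 8))
layer.compact(keep_ratio=0.5)  # task 1: keep the largest half, freeze them
layer.grow(4)                  # task 2 needs more capacity: widen the layer
```

Because each task's kept weights are frozen, earlier tasks are reproduced exactly (no forgetting), while the pruned-and-released slots plus any grown columns provide trainable capacity for the next task.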
URI: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85090178469&partnerID=40&md5=2eb67161c70a1dcfa72bca7c62e69ea5
     https://scholars.lib.ntu.edu.tw/handle/123456789/581319
ISSN: 1049-5258
Appears in Collections: Department of Computer Science and Information Engineering
Items in this repository are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.