Title: AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
Authors: Fu, Chin-Lun; Chen, Zih-Ching; Lee, Yun-Ru; Lee, Hung-Yi
Date Issued: 1-Jan-2022
Source: Findings of the Association for Computational Linguistics: NAACL 2022 - Findings
Abstract: Transformer-based pre-trained models with millions of parameters require large storage. Recent approaches tackle this shortcoming by training adapters, but these approaches still require a relatively large number of parameters. In this study, AdapterBias, a surprisingly simple yet effective adapter architecture, is proposed. AdapterBias adds a token-dependent shift to the hidden output of transformer layers to adapt to downstream tasks with only a vector and a linear layer. Extensive experiments are conducted to demonstrate the effectiveness of AdapterBias. The experiments show that the proposed method dramatically reduces trainable parameters compared with previous works, with only a minimal decrease in task performance relative to fully fine-tuned pre-trained models. We further find that AdapterBias automatically learns to assign more significant representation shifts to the tokens related to the task under consideration. (An illustrative sketch of this mechanism follows the record below.)
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/632636
ISBN: 9781955917766
Appears in Collections: Department of Electrical Engineering
Items in this IR system are protected by copyright, with all rights reserved, unless otherwise indicated.
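
The abstract describes the adapter as a single shift vector combined with a linear layer that weights the shift per token. Below is a minimal PyTorch sketch of that idea, based only on the abstract's description; the class and attribute names (AdapterBias, shift_vector, alpha) and the layer placement are illustrative assumptions, not the authors' released implementation.

    import torch
    import torch.nn as nn

    class AdapterBias(nn.Module):
        """Token-dependent representation shift: one shared vector,
        scaled per token by a learned scalar weight (a sketch)."""

        def __init__(self, hidden_dim: int):
            super().__init__()
            # Task-specific shift vector, shared across all tokens.
            self.shift_vector = nn.Parameter(torch.zeros(hidden_dim))
            # Linear layer producing one scalar per token, which makes
            # the shift token-dependent.
            self.alpha = nn.Linear(hidden_dim, 1)

        def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
            # hidden_states: (batch, seq_len, hidden_dim)
            weights = self.alpha(hidden_states)      # (batch, seq_len, 1)
            shift = weights * self.shift_vector      # broadcast over hidden_dim
            return hidden_states + shift             # shifted hidden output

    # Usage: apply the adapter to a transformer layer's hidden output.
    adapter = AdapterBias(hidden_dim=768)
    h = torch.randn(2, 16, 768)                      # dummy hidden states
    out = adapter(h)
    assert out.shape == h.shape

Only shift_vector and alpha are trained for a downstream task, which is consistent with the abstract's claim that adaptation needs "only a vector and a linear layer" per adapted layer.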