Title: Joint Dual Learning with Mutual Information Maximization for Natural Language Understanding and Generation in Dialogues
Authors: Su, Shang-Yu; Chung, Yung-Sung; Chen, Yun-Nung
Keywords: Dual learning; Mutual information; Natural language generation; Natural language processing; Natural language understanding; Natural languages; Oral communication; Semantics; Task analysis; Training
Date Issued: 1-Jan-2024
Source Publication: IEEE/ACM Transactions on Audio, Speech and Language Processing
Abstract: Modular conversational systems heavily rely on the performance of their natural language understanding (NLU) and natural language generation (NLG) components. NLU focuses on extracting core semantic concepts from input texts, while NLG constructs coherent sentences based on these extracted semantics. Inspired by information theory in digital communication, we introduce a one-way communication model that mirrors human conversations, comprising two distinct phases: (1) the conversion of thoughts into messages, similar to NLG, and (2) the comprehension of received messages, similar to NLU. This paper presents a novel algorithm that trains NLU and NLG collaboratively by concatenating their models and maximizing mutual information between inputs and outputs. This approach efficiently facilitates the transmission of semantics, leading to enhanced learning performance for both components. Our experimental results, based on three benchmark datasets, consistently demonstrate significant improvements for both NLU and NLG tasks, highlighting the practical promise of our proposed method.
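The quantity at the heart of the abstract is the mutual information between the inputs and outputs of the concatenated NLU/NLG pipeline. As a minimal sketch of that objective (not the paper's estimator, which works on neural model distributions; the function and variable names here are illustrative), the plug-in estimate of I(X; Y) for discrete toy samples can be computed as follows:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X; Y) in nats from observed (x, y) samples.

    In the paper's setting, X would be semantic frames and Y the sentences
    produced by NLG (or vice versa for NLU); maximizing this quantity
    encourages the pipeline to preserve semantics end to end.
    """
    n = len(pairs)
    joint = Counter(pairs)                 # empirical p(x, y) * n
    px = Counter(x for x, _ in pairs)      # empirical p(x) * n
    py = Counter(y for _, y in pairs)      # empirical p(y) * n
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # p_xy * log( p_xy / (p_x * p_y) ), with the n factors cancelled in
        mi += p_xy * math.log(p_xy * n * n / (px[x] * py[y]))
    return mi

# A perfectly informative channel: each frame maps to a unique sentence.
perfect = [("frame_a", "sent_a"), ("frame_b", "sent_b")] * 4
# An uninformative channel: the sentence is independent of the frame.
noisy = [("frame_a", "sent_a"), ("frame_a", "sent_b"),
         ("frame_b", "sent_a"), ("frame_b", "sent_b")] * 2

print(round(mutual_information(perfect), 4))  # 0.6931 (= ln 2)
print(round(mutual_information(noisy), 4))    # 0.0
```

A perfectly semantics-preserving mapping over two equiprobable frames attains the maximum of ln 2 nats, while an output independent of the input yields zero; the paper's training objective pushes the NLU and NLG models jointly toward the former regime.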
URI: https://scholars.lib.ntu.edu.tw/handle/123456789/641967
ISSN: 2329-9290
DOI: 10.1109/TASLP.2024.3364063
Appears in Collections: Department of Computer Science and Information Engineering
Items in this institutional repository are protected by copyright, with all rights reserved, unless otherwise indicated in their individual license terms.