Title: Joint Dual Learning with Mutual Information Maximization for Natural Language Understanding and Generation in Dialogues
Authors: Su, Shang-Yu; Chung, Yung-Sung; Chen, Yun-Nung
Type: Journal article
Date issued: 2024-01-01
Date available in repository: 2024-04-18
ISSN: 2329-9290
DOI: 10.1109/TASLP.2024.3364062
Scopus ID: 2-s2.0-85188911282
Scopus API: https://api.elsevier.com/content/abstract/scopus_id/85188911282
Handle: https://scholars.lib.ntu.edu.tw/handle/123456789/641967
Abstract: Modular conversational systems rely heavily on the performance of their natural language understanding (NLU) and natural language generation (NLG) components. NLU extracts core semantic concepts from input texts, while NLG constructs coherent sentences from those extracted semantics. Inspired by information theory in digital communication, we introduce a one-way communication model that mirrors human conversation and comprises two distinct phases: (1) converting thoughts into messages, analogous to NLG, and (2) comprehending received messages, analogous to NLU. This paper presents a novel algorithm that trains NLU and NLG collaboratively by concatenating their models and maximizing the mutual information between inputs and outputs. This approach efficiently facilitates the transmission of semantics, leading to improved learning performance for both components. Experimental results on three benchmark datasets consistently demonstrate significant improvements on both NLU and NLG tasks, highlighting the practical promise of the proposed method.
Keywords: Dual learning; Mutual information; Natural language generation; Natural language processing; Natural language understanding; Natural languages; Oral communication; Semantics; Task analysis; Training
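For a concrete picture of the kind of objective the abstract describes, a dual-learning setup of this sort is commonly written as two supervised reconstruction terms plus a mutual-information term. The sketch below is a generic illustration under that assumption, not the paper's exact formulation: x denotes a natural-language utterance, y its semantic representation, p_theta the NLU model, p_phi the NLG model, lambda an illustrative trade-off weight, and q a variational decoder used in the mutual-information lower bound.

\min_{\theta,\phi}\; \mathcal{L}(\theta,\phi)
  = \mathbb{E}_{(x,y)}\!\bigl[-\log p_\theta(y \mid x)\bigr]
  + \mathbb{E}_{(x,y)}\!\bigl[-\log p_\phi(x \mid y)\bigr]
  - \lambda\,\hat{I}(X;Y),
\qquad
\hat{I}(X;Y) \;\ge\; \mathbb{E}_{(x,y)}\bigl[\log q(y \mid x)\bigr] + H(Y).

The first two terms are the usual NLU and NLG losses; the last term rewards the concatenated models for preserving information between inputs and outputs, with \hat{I}(X;Y) a tractable lower bound on the mutual information (here the Barber-Agakov bound, in which H(Y), the entropy of the semantics under the data distribution, is constant with respect to the model parameters).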