Authors: Goo C.-W.; Gao G.; Hsu Y.-K.; Huo C.-L.; Chen T.-C.; Hsu K.-W.; Chen Y.-N.
Record dates: 2021-09-02; 2021-09-02
Year: 2018
Scopus URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85057752937&partnerID=40&md5=b0765c60c2a7bd01e9410ee65e60a209
Repository URL: https://scholars.lib.ntu.edu.tw/handle/123456789/581488
Abstract: Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but they use independent attention weights. Considering that slots and intents have a strong relationship, this paper proposes a slot gate that focuses on learning the relationship between the intent and slot attention vectors in order to obtain better semantic frame results through global optimization. Experiments show that the proposed model significantly improves sentence-level semantic frame accuracy, with 4.2% and 1.9% relative improvement over the attentional model on the benchmark ATIS and Snips datasets, respectively. © 2018 Association for Computational Linguistics.
Keywords: Computational linguistics; Global optimization; Semantics; Intent detection; Recurrent neural network models; Semantic frames; Sentence level; State-of-the-art performance; Recurrent neural networks
Title: Slot-Gated Modeling for Joint Slot Filling and Intent Prediction
Type: conference paper
DOI: 10.18653/v1/n18-2118
Scopus ID: 2-s2.0-85057752937
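The abstract describes a slot gate that learns a relationship between the per-token slot attention vectors and the sentence-level intent attention vector, and uses it to modulate slot prediction. The sketch below is one plausible realization of such a gate in PyTorch; the module name, tensor shapes, and the exact gating formula are assumptions for illustration and are not taken from this record.

```python
# Hypothetical sketch of a slot gate, based only on the abstract's description:
# the gate scores the combination of slot and intent attention (context) vectors
# and scales the slot context before slot tagging. Names, shapes, and the exact
# formula are assumptions, not confirmed by this record.
import torch
import torch.nn as nn


class SlotGate(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.W = nn.Linear(hidden_dim, hidden_dim, bias=False)  # projects the intent context
        self.v = nn.Parameter(torch.randn(hidden_dim))          # gate scoring vector

    def forward(self, slot_context: torch.Tensor, intent_context: torch.Tensor) -> torch.Tensor:
        # slot_context:   (batch, seq_len, hidden_dim) -- per-token slot attention vectors
        # intent_context: (batch, hidden_dim)          -- sentence-level intent attention vector
        proj = self.W(intent_context).unsqueeze(1)     # (batch, 1, hidden_dim)
        # Scalar gate per token from the combined slot/intent contexts
        g = torch.sum(self.v * torch.tanh(slot_context + proj), dim=-1, keepdim=True)
        # A downstream tagger would consume the gated slot context (e.g. added to
        # the encoder hidden states) to predict slot labels per token.
        return g * slot_context


# Toy usage with random tensors
if __name__ == "__main__":
    gate = SlotGate(hidden_dim=64)
    slot_ctx = torch.randn(2, 10, 64)   # batch of 2 sentences, 10 tokens each
    intent_ctx = torch.randn(2, 64)
    gated = gate(slot_ctx, intent_ctx)
    print(gated.shape)  # torch.Size([2, 10, 64])
```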