Jointly modeling inter-slot relations by random walk on knowledge graphs for unsupervised spoken language understanding
Published in
2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Pages
619-629
Date Issued
2015
Abstract
A key challenge of designing a coherent semantic ontology for spoken language understanding is to consider inter-slot relations. In practice, however, it is difficult for domain experts and professional annotators to define a coherent slot set while considering various lexical, syntactic, and semantic dependencies. In this paper, we exploit typed syntactic dependency theory for unsupervised induction and filling of semantic slots in spoken dialogue systems. More specifically, we build two knowledge graphs: a slot-based semantic graph and a word-based lexical graph. To jointly consider word-to-word, word-to-slot, and slot-to-slot relations, we use a random walk inference algorithm to combine the two knowledge graphs, guided by dependency grammars. The experiments show that considering inter-slot relations is crucial for generating a more coherent and complete slot set, resulting in a better spoken language understanding model, while enhancing the interpretability of semantic slots. © 2015 Association for Computational Linguistics.
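The random walk inference the abstract describes can be illustrated with a minimal sketch. The graph, node names, edge weights, and parameter values below are hypothetical placeholders, not the paper's actual data or implementation; the sketch only shows the general mechanism of a random walk with restart (power iteration) over a combined graph mixing slot and word nodes, so that word-to-word, word-to-slot, and slot-to-slot edges all contribute to a node's importance score.

```python
# Hypothetical toy graph mixing slot nodes and word nodes; the names and
# weights are illustrative only, not taken from the paper.
nodes = ["slot:locale", "slot:food", "word:restaurant", "word:cheap"]
edges = {
    ("slot:locale", "slot:food"): 1.0,       # slot-to-slot relation
    ("slot:food", "word:restaurant"): 1.0,   # word-to-slot relation
    ("word:restaurant", "word:cheap"): 1.0,  # word-to-word relation
    ("word:cheap", "slot:locale"): 1.0,      # word-to-slot relation
}

def random_walk_scores(nodes, edges, restart=0.15, iters=50):
    """Power iteration for a random walk with restart.

    Returns a score per node; higher scores mark nodes that the walk
    visits more often, i.e. nodes that are well connected through the
    combined slot/word graph.
    """
    n = len(nodes)
    idx = {v: i for i, v in enumerate(nodes)}
    # Build a symmetric weight matrix from the undirected edge list.
    w = [[0.0] * n for _ in range(n)]
    for (a, b), wt in edges.items():
        w[idx[a]][idx[b]] += wt
        w[idx[b]][idx[a]] += wt
    row_sums = [sum(row) for row in w]
    p = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        new = []
        for j in range(n):
            # Probability mass flowing into node j from its neighbors,
            # using row-normalized transition probabilities.
            flow = sum(
                p[i] * (w[i][j] / row_sums[i])
                for i in range(n)
                if row_sums[i] > 0
            )
            new.append(restart / n + (1 - restart) * flow)
        p = new
    return dict(zip(nodes, p))

scores = random_walk_scores(nodes, edges)
```

Because every node in this toy graph has at least one edge, the scores remain a probability distribution (they sum to 1), and the restart term keeps the iteration from getting trapped in any one part of the graph.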
Other Subjects
Computational linguistics; Graph algorithms; Inference engines; Knowledge representation; Random processes; Semantics; Speech processing; Syntactics; Coherent semantics; Dependency grammar; Inference algorithm; Knowledge graphs; Semantic dependency; Spoken dialogue system; Spoken language understanding; Syntactic dependencies; Modeling languages
Type
conference paper
