College: College of Electrical Engineering and Computer Science
Department: Graduate Institute of Computer Science and Information Engineering
Advisor: 陳信希 (Chen, Hsin-Hsi)
Author: Zhuang, Wen Li (莊文立)
Record dates: 2017-03-03; 2018-07-05
Year: 2016
URI: http://ntur.lib.ntu.edu.tw//handle/246246/275530

Abstract: Coreference resolution is a classic unsolved problem in natural language processing. We present a novel antecedent ranking model based on hierarchical recurrent neural networks (RNNs). A word-level RNN encodes each mention's context into a mention representation. A mention-level network is then trained to exploit these representations, together with a few hand-crafted features, to detect an anaphor and its antecedent via a simple attention mechanism. We evaluate our system on the CoNLL 2012 shared task and set a new state of the art.

File size: 1,029,734 bytes
Format: application/pdf
Thesis public date: 2018/8/24
Usage rights: Paid authorization agreed (royalties returned to the university)
Keywords: coreference resolution; antecedent ranking; recurrent neural networks; attention mechanism
Title: 基於遞迴神經網路的指代消解 (Coreference Resolution Using Recurrent Neural Networks)
Type: thesis
DOI: 10.6342/NTU201602003
Fulltext: http://ntur.lib.ntu.edu.tw/bitstream/246246/275530/1/ntu-105-R03922101-1.pdf
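The abstract describes ranking candidate antecedents for an anaphor by attending over learned mention representations. The thesis's actual architecture is not reproduced here; the following is only a minimal NumPy sketch of the general idea, assuming dot-product attention over fixed mention vectors (the function names and toy vectors are illustrative, not from the thesis):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def rank_antecedents(anaphor, candidates):
    """Score candidate antecedent vectors against an anaphor vector
    with dot-product attention; return (best index, attention weights)."""
    scores = np.array([c @ anaphor for c in candidates])
    weights = softmax(scores)
    return int(np.argmax(weights)), weights

# Toy mention representations, standing in for RNN-encoded mentions.
rng = np.random.default_rng(0)
candidates = [rng.standard_normal(8) for _ in range(3)]
# Make the anaphor vector close to candidate 1 so attention has a clear target.
anaphor = candidates[1] + 0.05 * rng.standard_normal(8)

best, weights = rank_antecedents(anaphor, candidates)
print(best)  # index of the highest-scoring candidate antecedent
```

In the thesis the mention vectors come from a word-level RNN and the scoring network is trained end to end; this sketch only shows the final attention-and-argmax ranking step.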