https://scholars.lib.ntu.edu.tw/handle/123456789/632694
Title: | Anticipation-Free Training for Simultaneous Machine Translation | Authors: | Chang, Chih Chiang; Chuang, Shun Po; HUNG-YI LEE |
Issue Date: | 1-Jan-2022 | Source: | IWSLT 2022 - 19th International Conference on Spoken Language Translation, Proceedings of the Conference | Abstract: | Simultaneous machine translation (SimulMT) speeds up the translation process by starting to translate before the source sentence is completely available. It is difficult due to limited context and word order differences between languages. Existing methods increase latency or introduce adaptive read-write policies for SimulMT models to handle local reordering and improve translation quality. However, long-distance reordering can cause SimulMT models to learn translation incorrectly. Specifically, the model may be forced to predict target tokens when the corresponding source tokens have not yet been read. This leads to aggressive anticipation during inference, resulting in the hallucination phenomenon. To mitigate this problem, we propose a new framework that decomposes the translation process into a monotonic translation step and a reordering step, and we model the latter with an auxiliary sorting network (ASN). The ASN rearranges the hidden states to match the order of the target language, so that the SimulMT model can learn to translate more reasonably. The entire model is optimized end-to-end and does not rely on external aligners or data. During inference, the ASN is removed to achieve streaming. Experiments show the proposed framework outperforms previous methods with lower latency. |
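The abstract does not specify how the ASN rearranges hidden states. As a purely illustrative sketch (not the authors' implementation), one common way to realize a differentiable, end-to-end trainable reordering is a Sinkhorn-style soft permutation: a learned score matrix is normalized into an approximately doubly stochastic matrix, which is then applied to the sequence of hidden states. All names below (`sinkhorn`, the toy dimensions) are assumptions for illustration only.

```python
import numpy as np

def sinkhorn(scores, n_iters=20):
    """Turn a raw score matrix into an approximately doubly stochastic
    (soft permutation) matrix by alternating row/column normalization.
    This is a generic technique, assumed here for illustration; the
    paper's ASN may use a different mechanism."""
    P = np.exp(scores)
    for _ in range(n_iters):
        P = P / P.sum(axis=1, keepdims=True)  # normalize rows
        P = P / P.sum(axis=0, keepdims=True)  # normalize columns
    return P

# Toy example: 3 source-order hidden states of dimension 2,
# softly rearranged toward a (hypothetical) target-language order.
rng = np.random.default_rng(0)
hidden = rng.normal(size=(3, 2))   # hidden states in source order
scores = rng.normal(size=(3, 3))   # learned pairwise reordering scores
P = sinkhorn(scores)
reordered = P @ hidden             # soft rearrangement of hidden states
```

At inference, as the abstract notes, the reordering module is removed entirely, so decoding proceeds monotonically over the source stream.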
URI: | https://scholars.lib.ntu.edu.tw/handle/123456789/632694 | ISBN: | 9781955917414 |
Appears in Collections: | Department of Electrical Engineering |
Items in the IR system are protected by copyright, with all rights reserved, unless otherwise indicated in their copyright terms.