A General Framework for Adaptation of Neural Machine Translation to Simultaneous Translation

Yun Chen, Liangyou Li, Xin Jiang, Xiao Chen, Qun Liu


Abstract
Despite the success of neural machine translation (NMT), simultaneous neural machine translation (SNMT), the task of translating in real time before the full source sentence has been observed, remains challenging due to syntactic structure differences between languages and the simultaneity requirement. In this paper, we propose a general framework for adapting neural machine translation to translate simultaneously. Our framework contains two parts: prefix translation, which uses a consecutive NMT model to translate source prefixes, and a stopping criterion, which determines when to stop the prefix translation. Experiments on three translation corpora and two language pairs show the efficacy of the proposed framework in balancing quality and latency when adapting NMT to perform simultaneous translation.
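
As a concrete illustration of the two-part framework described in the abstract, the Python sketch below alternates between reading source tokens and translating the current source prefix with a full-sentence (consecutive) decoder. The decoder interface (nmt_decode), the greedy monotonic commitment strategy, and the length-difference stopping criterion are illustrative assumptions made for this sketch; the paper defines and evaluates its own stopping criteria.

    # Minimal sketch of the framework: prefix translation with a consecutive
    # NMT model plus a stopping criterion that halts decoding on each prefix.
    # The model interface and the length-difference criterion are assumptions,
    # not the paper's exact formulation.
    from typing import Callable, List

    Decoder = Callable[[List[str], List[str]], str]
    StopFn = Callable[[List[str], List[str]], bool]

    def translate_prefix(nmt_decode: Decoder, src_prefix: List[str],
                         committed: List[str], stop: StopFn,
                         max_new: int = 10) -> List[str]:
        """Greedily extend the committed target using the consecutive NMT
        decoder, stopping when the criterion fires or end-of-sentence is
        emitted."""
        hyp = list(committed)
        for _ in range(max_new):
            if stop(src_prefix, hyp):
                break
            tok = nmt_decode(src_prefix, hyp)  # next-token prediction
            if tok == "</s>":
                break
            hyp.append(tok)
        return hyp

    def length_diff_stop(delta: int = 2) -> StopFn:
        # Assumed wait-k-style criterion: stop once the target is within
        # `delta` tokens of the observed source prefix length.
        return lambda src, tgt: len(tgt) >= len(src) - delta

    def simultaneous_translate(source_stream: List[str],
                               nmt_decode: Decoder) -> List[str]:
        """Alternate READ (consume one source token) and WRITE
        (prefix-translate under the stopping criterion)."""
        src_prefix: List[str] = []
        committed: List[str] = []
        stop = length_diff_stop()
        for src_tok in source_stream:      # READ one more source token
            src_prefix.append(src_tok)
            committed = translate_prefix(nmt_decode, src_prefix,
                                         committed, stop)
        # Source exhausted: finish without the latency constraint.
        return translate_prefix(nmt_decode, src_prefix, committed,
                                lambda s, t: False, max_new=50)

    if __name__ == "__main__":
        # Toy "decoder" that copies the next untranslated source token.
        def copy_decoder(src: List[str], tgt: List[str]) -> str:
            return src[len(tgt)] if len(tgt) < len(src) else "</s>"

        print(simultaneous_translate("the cat sat down".split(), copy_decoder))

Note that the final pass uses an always-false stopping criterion, so decoding falls back to ordinary full-sentence translation once the source is exhausted; this mirrors how the framework adapts an unmodified consecutive NMT model rather than training a dedicated simultaneous one.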
Anthology ID: 2020.aacl-main.23
Volume: Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month: December
Year: 2020
Address: Suzhou, China
Editors: Kam-Fai Wong, Kevin Knight, Hua Wu
Venue: AACL
Publisher: Association for Computational Linguistics
Pages: 191–200
URL: https://aclanthology.org/2020.aacl-main.23
DOI: 10.18653/v1/2020.aacl-main.23
Cite (ACL): Yun Chen, Liangyou Li, Xin Jiang, Xiao Chen, and Qun Liu. 2020. A General Framework for Adaptation of Neural Machine Translation to Simultaneous Translation. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 191–200, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): A General Framework for Adaptation of Neural Machine Translation to Simultaneous Translation (Chen et al., AACL 2020)
PDF: https://aclanthology.org/2020.aacl-main.23.pdf