2022
BiTIIMT: A Bilingual Text-infilling Method for Interactive Machine Translation
Yanling Xiao | Lemao Liu | Guoping Huang | Qu Cui | Shujian Huang | Shuming Shi | Jiajun Chen
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Interactive machine translation (IMT) is able to guarantee high-quality translations by taking human interactions into account. Existing IMT systems that rely on lexically constrained decoding (LCD) enable humans to translate in a flexible order beyond left-to-right. However, they typically suffer from two significant limitations, in translation efficiency and in translation quality, due to their reliance on LCD. In this work, we propose a novel BiTIIMT system: Bilingual Text-Infilling for Interactive Machine Translation. The key idea behind BiTIIMT is Bilingual Text-infilling (BiTI), which aims to fill in the missing segments of a manually revised translation for a given source sentence. We propose a simple yet effective solution that casts this task as a sequence-to-sequence problem. In this way, our system performs decoding without explicit constraints and makes full use of the revised words for better translation prediction. Experimental results show that BiTIIMT performs significantly better and faster than state-of-the-art LCD-based IMT systems on three translation tasks.
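The input formulation described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the helper name `make_biti_input` and the special tokens `<sep>` and `<blank>` are assumptions chosen for clarity. It builds the bilingual sequence a seq2seq model would consume, with the source sentence joined to a partially revised target whose missing segments are marked by blank tokens for the model to fill.

```python
def make_biti_input(src, revised, sep="<sep>", blank="<blank>"):
    """Build a bilingual text-infilling input (illustrative sketch).

    src:     source sentence (string)
    revised: list of revised target spans, where None marks a
             missing segment the model should fill in
    Token names <sep> and <blank> are hypothetical placeholders.
    """
    # Replace each missing segment with a blank token.
    tgt = " ".join(blank if span is None else span for span in revised)
    # Concatenate source and partial target into one input sequence.
    return f"{src} {sep} {tgt}"


# Example: the human has revised "I love" and left the rest blank.
example = make_biti_input("ich liebe katzen", ["I love", None])
print(example)  # ich liebe katzen <sep> I love <blank>
```

Because the revised spans appear directly in the input, the decoder can condition on them freely instead of enforcing them as hard constraints during search, which is what lets the system avoid LCD at decoding time.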