Prompt-Driven Neural Machine Translation

Yafu Li, Yongjing Yin, Jing Li, Yue Zhang


Abstract
Neural machine translation (NMT) has achieved significant performance improvements in recent years. However, NMT models still face various challenges, including fragility and a lack of style flexibility. Moreover, current methods for instance-level constraints are limited in that they are either constraint-specific or model-specific. To this end, we propose prompt-driven neural machine translation, which incorporates prompts to enhance translation control and enrich flexibility. Empirical results demonstrate the effectiveness of our method in both prompt responding and translation quality. Through human evaluation, we further show the flexibility of prompt control and the efficiency in human-in-the-loop translation.
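The core idea the abstract describes is steering a translation model with an instance-level prompt. As a rough illustration only (not the authors' implementation; see yafuly/promptnmt for that), the sketch below prepends a textual prompt segment to the source sentence before feeding it to an off-the-shelf seq2seq translation model. The prompt string, separator, and model choice are all illustrative assumptions; a model would need to be trained with such prompts to actually respect them.

```python
# Illustrative sketch, NOT the paper's method: prepend a constraint/style
# prompt to the source sentence of a generic seq2seq NMT model.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-en-de"  # any off-the-shelf NMT model
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

def prompted_translate(source: str, prompt: str = "") -> str:
    # Concatenate the prompt with the source sentence; a model trained with
    # such prompted inputs could learn to follow the constraint, whereas a
    # vanilla model merely sees it as extra context.
    prompted_source = f"{prompt} | {source}" if prompt else source
    batch = tokenizer([prompted_source], return_tensors="pt")
    generated = model.generate(**batch, max_new_tokens=64)
    return tokenizer.decode(generated[0], skip_special_tokens=True)

print(prompted_translate("The cat sat on the mat.", prompt="formal register"))
```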
Anthology ID:
2022.findings-acl.203
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2579–2590
URL:
https://aclanthology.org/2022.findings-acl.203
DOI:
10.18653/v1/2022.findings-acl.203
Cite (ACL):
Yafu Li, Yongjing Yin, Jing Li, and Yue Zhang. 2022. Prompt-Driven Neural Machine Translation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2579–2590, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Prompt-Driven Neural Machine Translation (Li et al., Findings 2022)
PDF:
https://preview.aclanthology.org/naacl24-info/2022.findings-acl.203.pdf
Code:
yafuly/promptnmt