Non-Autoregressive Document-Level Machine Translation

Guangsheng Bao, Zhiyang Teng, Hao Zhou, Jianhao Yan, Yue Zhang


Abstract
Non-autoregressive translation (NAT) models achieve performance comparable to autoregressive translation (AT) models with superior decoding speed in sentence-level machine translation (MT). However, their ability in document-level MT remains unexplored, hindering their application in real-world scenarios. In this paper, we conduct a comprehensive examination of typical NAT models in the context of document-level MT and further propose a simple but effective design of sentence alignment between source and target. Experiments show that NAT models achieve high acceleration on documents and that sentence alignment significantly enhances their performance. However, current NAT models still exhibit a significant performance gap compared to their AT counterparts. Further investigation reveals that NAT models suffer more severely from the multi-modality and misalignment issues in the context of document-level MT, and that current NAT models struggle to exploit document context and handle discourse phenomena. We delve into these challenges and provide our code at https://github.com/baoguangsheng/nat-on-doc.
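The abstract only names the sentence-alignment design without detailing it. As a minimal sketch under our own assumptions (the function name sentence_alignment_mask, the token-level sentence indices, and the one-to-one correspondence between the i-th source and i-th target sentence are illustrative, not taken from the paper), such alignment can be realized as a block-diagonal cross-attention mask that restricts each target sentence to attend only to its aligned source sentence:

```python
import torch

def sentence_alignment_mask(src_sent_ids, tgt_sent_ids):
    """Illustrative sketch: build a cross-attention mask that lets each
    target token attend only to tokens of its aligned source sentence.

    src_sent_ids: LongTensor [src_len], sentence index of each source token
    tgt_sent_ids: LongTensor [tgt_len], sentence index of each target token
    Returns: BoolTensor [tgt_len, src_len], True where attention is allowed.
    """
    # Assumes a one-to-one sentence alignment: target sentence i is
    # translated from source sentence i (block-diagonal by sentence).
    return tgt_sent_ids.unsqueeze(1) == src_sent_ids.unsqueeze(0)

# Example: a 2-sentence document with per-token sentence indices.
src_sent_ids = torch.tensor([0, 0, 0, 1, 1])      # 3 + 2 source tokens
tgt_sent_ids = torch.tensor([0, 0, 1, 1, 1, 1])   # 2 + 4 target tokens
mask = sentence_alignment_mask(src_sent_ids, tgt_sent_ids)
print(mask.int())  # target tokens of sentence 0 see only source sentence 0
```

Such a mask would be applied in the decoder's cross-attention; whether the paper enforces alignment this way or through another mechanism is not stated in the abstract.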
Anthology ID:
2023.findings-emnlp.986
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14791–14803
URL:
https://aclanthology.org/2023.findings-emnlp.986
DOI:
10.18653/v1/2023.findings-emnlp.986
Cite (ACL):
Guangsheng Bao, Zhiyang Teng, Hao Zhou, Jianhao Yan, and Yue Zhang. 2023. Non-Autoregressive Document-Level Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14791–14803, Singapore. Association for Computational Linguistics.
Cite (Informal):
Non-Autoregressive Document-Level Machine Translation (Bao et al., Findings 2023)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-emnlp.986.pdf