On Search Strategies for Document-Level Neural Machine Translation

Christian Herold, Hermann Ney


Abstract
Compared to sentence-level systems, document-level neural machine translation (NMT) models produce more consistent output across a document and can better resolve ambiguities in the input. Many works address document-level NMT, mostly by modifying the model architecture or training strategy to better accommodate the additional context input. In contrast, the question of how to perform search with the trained model is rarely discussed, and sometimes not mentioned at all. In this work, we aim to answer the question of how to best utilize a context-aware translation model in decoding. We start with the most popular document-level NMT approach and compare different decoding schemes, some from the literature and others proposed by us, evaluating with both standard automatic metrics and targeted linguistic phenomena on three standard document-level translation benchmarks. We find that the most commonly used decoding strategies perform similarly to each other, and that higher-quality context information has the potential to further improve the translation.
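A common document-level decoding scheme prepends the preceding source sentences as context before translating the current one. The following is a minimal sketch of that sliding-window idea, not the paper's exact method; the separator token, the window size, and the stand-in `translate` function (which just uppercases its input in place of real beam search) are assumptions for illustration.

```python
# Sketch of context-concatenated decoding for document-level NMT.
# A real system would replace `translate` with beam search over a
# trained context-aware model; here it is a trivial stand-in.

SEP = " <sep> "  # hypothetical separator between context and current sentence

def translate(src: str) -> str:
    """Stand-in for beam-search decoding with a trained NMT model."""
    return src.upper()  # placeholder "translation"

def decode_document(sentences, context_size=1):
    """Translate each sentence with up to `context_size` previous
    source sentences prepended, keeping only the part of the output
    after the last separator as the current sentence's translation."""
    outputs = []
    for i, sent in enumerate(sentences):
        ctx = sentences[max(0, i - context_size):i]
        src = SEP.join(ctx + [sent])
        hyp = translate(src)
        outputs.append(hyp.split(SEP.upper())[-1].strip())
    return outputs
```

Variants studied in the literature differ, for example, in whether the context sentences are source-side or previously generated target-side translations, and in how much of the concatenated hypothesis is kept.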
Anthology ID:
2023.findings-acl.811
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12827–12836
URL:
https://aclanthology.org/2023.findings-acl.811
DOI:
10.18653/v1/2023.findings-acl.811
Bibkey:
Cite (ACL):
Christian Herold and Hermann Ney. 2023. On Search Strategies for Document-Level Neural Machine Translation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12827–12836, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
On Search Strategies for Document-Level Neural Machine Translation (Herold & Ney, Findings 2023)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-acl.811.pdf