Abstract
Resolving semantic ambiguity has long been recognised as a central challenge in the field of Machine Translation. Recent work on benchmarking translation performance on ambiguous sentences has exposed the limitations of conventional Neural Machine Translation (NMT) systems, which fail to handle many such cases. Large language models (LLMs) have emerged as a promising alternative, demonstrating comparable performance to traditional NMT models while introducing new paradigms for controlling the target outputs. In this paper, we study the capabilities of LLMs to translate “ambiguous sentences”, i.e. those containing highly polysemous words and/or rare word senses. We also propose two ways to improve their disambiguation capabilities, through a) in-context learning and b) fine-tuning on carefully curated ambiguous datasets. Experiments show that our methods can match or outperform state-of-the-art systems such as DeepL and NLLB in four out of five language directions. Our research provides valuable insights into effectively adapting LLMs to become better disambiguators during Machine Translation. We release our curated disambiguation corpora and resources at https://data.statmt.org/ambiguous-europarl.
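To make the in-context learning approach mentioned above concrete, below is a minimal sketch of a few-shot prompting setup: demonstration translations that contain the same ambiguous word can steer the model toward the intended sense. The prompt format, helper name, and example pair are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch of few-shot prompting for disambiguation in MT:
# prepend (source, target) demonstrations that share the ambiguous word,
# so the LLM can infer the intended sense before translating.
# The example pair and test sentence below are illustrative placeholders.

def build_disambiguation_prompt(src_lang, tgt_lang, examples, sentence):
    """Assemble a few-shot translation prompt from (source, target) pairs."""
    lines = [f"Translate the following sentences from {src_lang} to {tgt_lang}."]
    for src, tgt in examples:
        lines.append(f"{src_lang}: {src}")
        lines.append(f"{tgt_lang}: {tgt}")
    # Append the sentence to translate, leaving the target side open.
    lines.append(f"{src_lang}: {sentence}")
    lines.append(f"{tgt_lang}:")
    return "\n".join(lines)

# "bank" is polysemous; the demonstration pins down the financial sense.
examples = [
    ("She deposited the cheque at the bank.",
     "Elle a déposé le chèque à la banque."),
]
print(build_disambiguation_prompt("English", "French", examples,
                                  "The bank approved his loan application."))
```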
- Anthology ID: 2023.wmt-1.44
- Volume: Proceedings of the Eighth Conference on Machine Translation
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
- Venue: WMT
- SIG: SIGMT
- Publisher: Association for Computational Linguistics
- Pages: 482–495
- URL: https://aclanthology.org/2023.wmt-1.44
- DOI: 10.18653/v1/2023.wmt-1.44
- Cite (ACL): Vivek Iyer, Pinzhen Chen, and Alexandra Birch. 2023. Towards Effective Disambiguation for Machine Translation with Large Language Models. In Proceedings of the Eighth Conference on Machine Translation, pages 482–495, Singapore. Association for Computational Linguistics.
- Cite (Informal): Towards Effective Disambiguation for Machine Translation with Large Language Models (Iyer et al., WMT 2023)
- PDF: https://preview.aclanthology.org/add_acl24_videos/2023.wmt-1.44.pdf