Enhanced Zero-Shot Machine Translation via Fixed Prefix Pair Bootstrapping

Van-Hien Tran, Masao Utiyama


Abstract
Zero-shot in-context learning allows large language models (LLMs) to perform tasks using only provided instructions. However, pre-trained LLMs often face calibration issues in zero-shot scenarios, leading to challenges such as hallucinations and off-target translations that compromise output quality, particularly in machine translation (MT). This paper introduces a new method to improve zero-shot MT using fixed prefix pair bootstrapping. By initializing translations with an accurate bilingual prefix pair at the start of both source and target sentences, this approach effectively guides the model to generate precise target-language outputs. Extensive evaluations across four model architectures and multiple translation directions demonstrate significant and consistent improvements, showcasing the potential of this straightforward strategy to enhance zero-shot MT performance.
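The core idea described in the abstract can be sketched as a prompt-construction step: an accurate bilingual prefix pair is prepended to the source sentence, and the target-side prefix is placed at the start of the output so the model begins decoding already anchored in the target language. The template below and the example prefix pair ("Hello." / "Bonjour.") are illustrative assumptions, not the paper's exact prompt.

```python
# Hedged sketch of fixed prefix pair bootstrapping for zero-shot MT.
# The prompt template and the prefix pair are assumptions for illustration;
# the paper's exact wording may differ.

def build_bootstrapped_prompt(source: str, src_lang: str, tgt_lang: str,
                              prefix_src: str, prefix_tgt: str) -> str:
    """Prepend a known-accurate bilingual prefix pair to both sides.

    The source sentence is preceded by the source-language prefix, and the
    target line ends with the target-language prefix, so the LLM continues
    generation from an already-correct target-language beginning. This is
    intended to reduce off-target translations and hallucinated openings.
    """
    return (
        f"Translate the following {src_lang} sentence into {tgt_lang}.\n"
        f"{src_lang}: {prefix_src} {source}\n"
        f"{tgt_lang}: {prefix_tgt}"  # decoding continues after this prefix
    )

prompt = build_bootstrapped_prompt(
    source="How are you today?",
    src_lang="English",
    tgt_lang="French",
    prefix_src="Hello.",
    prefix_tgt="Bonjour.",
)
print(prompt)
```

After generation, the fixed target prefix (here "Bonjour.") would be stripped from the model output to recover the translation of the original source sentence alone.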
Anthology ID:
2025.loresmt-1.2
Volume:
Proceedings of the Eighth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2025)
Month:
May
Year:
2025
Address:
Albuquerque, New Mexico, U.S.A.
Editors:
Atul Kr. Ojha, Chao-hong Liu, Ekaterina Vylomova, Flammie Pirinen, Jonathan Washington, Nathaniel Oco, Xiaobing Zhao
Venues:
LoResMT | WS
Publisher:
Association for Computational Linguistics
Pages:
10–15
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.loresmt-1.2/
Cite (ACL):
Van-Hien Tran and Masao Utiyama. 2025. Enhanced Zero-Shot Machine Translation via Fixed Prefix Pair Bootstrapping. In Proceedings of the Eighth Workshop on Technologies for Machine Translation of Low-Resource Languages (LoResMT 2025), pages 10–15, Albuquerque, New Mexico, U.S.A. Association for Computational Linguistics.
Cite (Informal):
Enhanced Zero-Shot Machine Translation via Fixed Prefix Pair Bootstrapping (Tran & Utiyama, LoResMT 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.loresmt-1.2.pdf