Compositional Translation: A Novel LLM-based Approach for Low-resource Machine Translation

Armel Randy Zebaze, Benoît Sagot, Rachel Bawden


Abstract
The ability of generative large language models (LLMs) to perform in-context learning has given rise to a large body of research into how best to prompt models for various natural language processing tasks. Machine Translation (MT) has been shown to benefit from in-context examples, in particular when they are semantically similar to the sentence to translate. In this paper, we propose a new LLM-based translation paradigm, compositional translation, to replace naive few-shot MT with similarity-based demonstrations. An LLM is used to decompose a sentence into simpler phrases, and then to translate each phrase with the help of retrieved demonstrations. Finally, the LLM is prompted to translate the initial sentence with the help of the self-generated phrase-translation pairs. Our intuition is that this approach should improve translation because these shorter phrases should be intrinsically easier to translate and easier to match with relevant examples. This is especially beneficial in low-resource scenarios, and more generally whenever the selection pool is small or out of domain. We show that compositional translation boosts LLM translation performance on a wide range of popular MT benchmarks, including FLORES 200, NTREX 128 and TICO-19. Code and outputs will be made freely available.
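The sketch below illustrates the three-step pipeline described in the abstract (decompose, translate phrases with retrieved demonstrations, then translate the full sentence conditioned on the phrase-translation pairs). It is a minimal illustration, not the authors' implementation: the prompt wording and the llm and retrieve callables are hypothetical placeholders, to be backed by any text-generation API and any similarity-based retriever over a parallel selection pool.

# Minimal sketch of compositional translation (hypothetical helpers, not the paper's code).
from typing import Callable, List, Tuple

def compositional_translate(
    sentence: str,
    llm: Callable[[str], str],                              # text-in/text-out LLM call (assumed)
    retrieve: Callable[[str, int], List[Tuple[str, str]]],  # returns (source, target) demo pairs (assumed)
    src_lang: str = "English",
    tgt_lang: str = "Swahili",
    k: int = 4,
) -> str:
    # Step 1: ask the LLM to decompose the sentence into simpler phrases.
    decomposition_prompt = (
        f"Split the following {src_lang} sentence into short, self-contained phrases, "
        f"one per line:\n{sentence}"
    )
    phrases = [p.strip() for p in llm(decomposition_prompt).splitlines() if p.strip()]

    # Step 2: translate each phrase few-shot, using its most similar retrieved demonstrations.
    phrase_pairs = []
    for phrase in phrases:
        demos = retrieve(phrase, k)
        demo_block = "\n".join(f"{s} => {t}" for s, t in demos)
        phrase_prompt = (
            f"Translate from {src_lang} to {tgt_lang}.\n"
            f"{demo_block}\n{phrase} =>"
        )
        phrase_pairs.append((phrase, llm(phrase_prompt).strip()))

    # Step 3: translate the original sentence, conditioning on the self-generated pairs.
    pair_block = "\n".join(f"{s} => {t}" for s, t in phrase_pairs)
    final_prompt = (
        f"Using these phrase translations as hints:\n{pair_block}\n"
        f"Translate the full {src_lang} sentence into {tgt_lang}:\n{sentence} =>"
    )
    return llm(final_prompt).strip()

In practice, the retrieve role would be filled by a sentence-embedding search over whatever parallel data is available, which is exactly where the shorter phrases are expected to match more relevant examples than the full sentence would.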
Anthology ID:
2025.findings-emnlp.1216
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
22328–22357
URL:
https://aclanthology.org/2025.findings-emnlp.1216/
DOI:
10.18653/v1/2025.findings-emnlp.1216
Cite (ACL):
Armel Randy Zebaze, Benoît Sagot, and Rachel Bawden. 2025. Compositional Translation: A Novel LLM-based Approach for Low-resource Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 22328–22357, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Compositional Translation: A Novel LLM-based Approach for Low-resource Machine Translation (Zebaze et al., Findings 2025)
PDF:
https://aclanthology.org/2025.findings-emnlp.1216.pdf
Checklist:
2025.findings-emnlp.1216.checklist.pdf