Machine Translation with Large Language Models: Prompting, Few-shot Learning, and Fine-tuning with QLoRA

Xuan Zhang, Navid Rajabi, Kevin Duh, Philipp Koehn


Abstract
While large language models have made remarkable advances in natural language generation, their potential for machine translation, especially when fine-tuned, remains under-explored. In this study, we conduct comprehensive experiments evaluating 15 publicly available language models on machine translation tasks. We compare performance across three methodologies: zero-shot prompting, few-shot learning, and fine-tuning. Central to our approach is QLoRA, a memory-efficient fine-tuning method. On French-English translation, QLoRA fine-tuning outperforms both few-shot learning and models trained from scratch, at both the sentence level and the document level, with a significant BLEU score improvement of 28.93 over the prompting method. Notably, QLoRA achieves this performance while fine-tuning a mere 0.77% of the model's parameters.
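For readers curious about what the QLoRA recipe looks like in practice, below is a minimal sketch using the Hugging Face transformers, peft, and bitsandbytes libraries. The model name, LoRA rank, target module names, and prompt template are illustrative assumptions, not the paper's exact configuration; see the paper itself for the actual models and hyperparameters.

```python
# Minimal QLoRA fine-tuning sketch (illustrative, not the paper's exact setup):
# load a causal LM with 4-bit NF4 quantization and attach LoRA adapters so
# that only a small fraction of parameters is trained.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "bigscience/bloom-7b1"  # placeholder; the paper evaluates 15 public LLMs

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # QLoRA: base weights stored in 4-bit
    bnb_4bit_quant_type="nf4",              # NormalFloat4 quantization
    bnb_4bit_use_double_quant=True,         # double-quantize the quantization constants
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bfloat16
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, quantization_config=bnb_config, device_map="auto"
)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=16,                                # LoRA rank (hypothetical value)
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # module names differ per architecture
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports the small trainable fraction

# An MT training example would then be a prompt-completion pair, e.g.
# (the template here is an assumption, not the paper's exact format):
example = "Translate French to English.\nFrench: Bonjour le monde.\nEnglish: Hello world."
```

With a configuration along these lines, print_trainable_parameters() typically reports well under 1% of parameters as trainable, the same order of magnitude as the 0.77% figure cited in the abstract.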
Anthology ID:
2023.wmt-1.43
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
468–481
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.wmt-1.43/
DOI:
10.18653/v1/2023.wmt-1.43
Cite (ACL):
Xuan Zhang, Navid Rajabi, Kevin Duh, and Philipp Koehn. 2023. Machine Translation with Large Language Models: Prompting, Few-shot Learning, and Fine-tuning with QLoRA. In Proceedings of the Eighth Conference on Machine Translation, pages 468–481, Singapore. Association for Computational Linguistics.
Cite (Informal):
Machine Translation with Large Language Models: Prompting, Few-shot Learning, and Fine-tuning with QLoRA (Zhang et al., WMT 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.wmt-1.43.pdf