@inproceedings{fujita-takada-2025-improving,
    title = "Improving Low-Resource {J}apanese Translation with Fine-Tuning and Backtranslation for the {WMT} 25 General Translation Task",
    author = "Fujita, Felipe  and
      Takada, Hideyuki",
    editor = "Haddow, Barry  and
      Kocmi, Tom  and
      Koehn, Philipp  and
      Monz, Christof",
    booktitle = "Proceedings of the Tenth Conference on Machine Translation",
    month = nov,
    year = "2025",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.52/",
    pages = "765--768",
    ISBN = "979-8-89176-341-8",
    abstract = "In this paper, we explore the effectiveness of combining fine-tuning and backtranslation on a small Japanese corpus for neural machine translation. Starting from a baseline English{\textrightarrow}Japanese model (COMET = 0.460), we first apply backtranslation (BT) using synthetic data generated from monolingual Japanese corpora, yielding a modest increase (COMET = 0.468). Next, we fine-tune (FT) the model on a genuine small parallel dataset drawn from diverse Japanese news and literary corpora, achieving a substantial jump to COMET = 0.589 when using Mistral 7B. Finally, we integrate both backtranslation and fine-tuning{---}first augmenting the small dataset with BT generated examples, then adapting via FT{---}which further boosts performance to COMET = 0.597. These results demonstrate that, even with limited training data, the synergistic use of backtranslation and targeted fine-tuning on Japanese corpora can significantly enhance translation quality, outperforming each technique in isolation. This approach offers a lightweight yet powerful strategy for improving low-resource language pairs."
}