Targeted Source Text Editing for Machine Translation: Exploiting Quality Estimators and Large Language Models

Hyuga Koretaka, Atsushi Fujita, Tomoyuki Kajiwara


Abstract
To improve the translation quality of “black-box” machine translation (MT) systems, we focus on the automatic editing of source texts to be translated. In addition to the use of a large language model (LLM) to implement robust and accurate editing, we investigate the usefulness of targeted editing, i.e., instructing the LLM with a text span to be edited. Our method determines such source text spans using a span-level quality estimator, which identifies actual translation errors caused by the MT system of interest, and a word aligner, which identifies alignments between the tokens in the source text and translation hypothesis. Our empirical experiments with eight MT systems and ten test datasets for four translation directions confirmed the efficacy of our method in improving translation quality. Through analyses, we identified several characteristics of our method and found that the segment-level quality estimator is a vital component of our method.
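The abstract describes a pipeline that projects hypothesis-side translation errors back to the source via word alignment, then asks an LLM to edit only those source spans. The following is a minimal, hypothetical sketch of that flow; every component here is a toy stand-in (the real method uses a trained span-level quality estimator, a word aligner, and an LLM prompt, none of which are reproduced here).

```python
# Hypothetical sketch of targeted source-text editing for MT.
# All components are toy stand-ins, not the authors' implementation.

def estimate_error_spans(hypothesis):
    """Toy span-level QE: flag hypothesis tokens marked with a leading '*'.
    A real QE model would predict error spans from the text itself."""
    return [i for i, tok in enumerate(hypothesis.split()) if tok.startswith("*")]

def align_words(source, hypothesis):
    """Toy word aligner: align tokens by position.
    A real system would use a learned aligner."""
    n = min(len(source.split()), len(hypothesis.split()))
    return {i: i for i in range(n)}

def project_spans_to_source(error_spans, alignment):
    """Map hypothesis-side error token indices to source-side indices."""
    return sorted({alignment[i] for i in error_spans if i in alignment})

def targeted_edit(source, source_spans):
    """Stand-in for the LLM editing step: bracket the flagged source tokens,
    where the real method would instruct an LLM to rewrite those spans."""
    tokens = source.split()
    for i in source_spans:
        tokens[i] = f"[{tokens[i]}]"
    return " ".join(tokens)

# Toy example: the error marker '*' simulates a QE-detected mistranslation.
source = "she signed the contract"
hypothesis = "elle *signa le contrat"
spans = estimate_error_spans(hypothesis)
alignment = align_words(source, hypothesis)
source_spans = project_spans_to_source(spans, alignment)
edited = targeted_edit(source, source_spans)
print(edited)  # → she [signed] the contract
```

The key idea the sketch illustrates is the projection step: errors are detected on the MT output, but the edit is applied to the corresponding source-side span before re-translating.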
Anthology ID:
2025.wmt-1.12
Volume:
Proceedings of the Tenth Conference on Machine Translation
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Barry Haddow, Tom Kocmi, Philipp Koehn, Christof Monz
Venue:
WMT
Publisher:
Association for Computational Linguistics
Note:
Pages:
200–219
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.12/
Cite (ACL):
Hyuga Koretaka, Atsushi Fujita, and Tomoyuki Kajiwara. 2025. Targeted Source Text Editing for Machine Translation: Exploiting Quality Estimators and Large Language Models. In Proceedings of the Tenth Conference on Machine Translation, pages 200–219, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Targeted Source Text Editing for Machine Translation: Exploiting Quality Estimators and Large Language Models (Koretaka et al., WMT 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.wmt-1.12.pdf