Construction-Based Reduction of Translationese for Low-Resource Languages: A Pilot Study on Bavarian
Peiqin Lin | Marion Thaler | Daniela Goschala | Amir Hossein Kargaran | Yihong Liu | André Martins | Hinrich Schütze
Proceedings of the 7th Workshop on Research in Computational Linguistic Typology and Multilingual NLP, 2025
When translating into a low-resource language, a language model tends to produce translations that stay close to the source (e.g., word-by-word translations), because rich low-resource data is scarce in pretraining. The output is therefore often translationese that differs considerably from what native speakers would produce naturally. To remedy this, we synthetically create a training set in which the frequency of a construction unique to the low-resource language is artificially inflated. For the case of Bavarian, we show that, after training, the language model has learned the unique construction and that native speakers judge its output as more natural. Our pilot study suggests that construction-based mitigation of translationese is a promising approach. Code and artifacts are available at https://github.com/cisnlp/BayernGPT.
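The core idea of the abstract, inflating the frequency of a target construction in a synthetic training set, can be illustrated with a short upsampling sketch. The following is a minimal, hypothetical example and does not reflect the authors' actual pipeline: the `contains_construction` helper, the surface-marker list, the toy sentence pairs, and the upsampling factor are all illustrative assumptions.

```python
# Hypothetical sketch: oversample (source, target) pairs whose target side
# contains a marker of the construction of interest, so the construction's
# frequency in the training set is artificially inflated.
import random


def contains_construction(sentence: str, markers: list[str]) -> bool:
    """Return True if the sentence contains any surface marker of the
    target construction (a stand-in for a real construction detector)."""
    lowered = sentence.lower()
    return any(marker in lowered for marker in markers)


def upsample(pairs, markers, factor=5, seed=0):
    """Duplicate pairs that exhibit the construction (factor - 1) extra
    times, then shuffle the resulting training set."""
    random.seed(seed)
    out = []
    for src, tgt in pairs:
        out.append((src, tgt))
        if contains_construction(tgt, markers):
            out.extend([(src, tgt)] * (factor - 1))
    random.shuffle(out)
    return out


if __name__ == "__main__":
    # Toy parallel data: (German source, Bavarian target).
    pairs = [
        ("Ich habe es gesehen.", "I hob's gsehn."),
        ("Das ist gut.", "Des is guad."),
    ]
    # Placeholder surface markers for the target construction.
    markers = ["gsehn"]
    for src, tgt in upsample(pairs, markers, factor=5):
        print(src, "->", tgt)
```

Upsampling by duplication is the simplest way to shift a construction's frequency; a real pipeline might instead generate new synthetic sentences exhibiting the construction, but the frequency-inflation effect on the training distribution is the same.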