Abstract
Machine translation systems for high-resource languages perform exceptionally well and produce high-quality translations. Unfortunately, the vast majority of languages are not considered high resource and lack the quantity of parallel sentences needed to train such systems. These under-represented languages are not without resources, however: bilingual dictionaries and grammar books are available as linguistic reference material. With current large language models (LLMs) supporting near book-length contexts, we can begin to use the available material to ensure that advancements are shared among all of the world's languages. In this paper, we demonstrate that incorporating grammar books into the prompt of GPT-4 improves machine translation, and we evaluate performance on 16 typologically diverse low-resource languages, using a combination of reference materials to show that the machine translation performance of LLMs can be improved using this method.
- Anthology ID:
- 2024.emnlp-main.1127
- Volume:
- Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2024
- Address:
- Miami, Florida, USA
- Editors:
- Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 20207–20219
- URL:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2024.emnlp-main.1127/
- DOI:
- 10.18653/v1/2024.emnlp-main.1127
- Cite (ACL):
- Jonathan Hus and Antonios Anastasopoulos. 2024. Back to School: Translation Using Grammar Books. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 20207–20219, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal):
- Back to School: Translation Using Grammar Books (Hus & Anastasopoulos, EMNLP 2024)
- PDF:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2024.emnlp-main.1127.pdf
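The approach described in the abstract — packing linguistic reference material into a long-context prompt — can be sketched as follows. This is a minimal illustration only: the function name, prompt wording, and toy data are assumptions, not taken from the paper.

```python
def build_translation_prompt(source_sentence, grammar_excerpt, dictionary_entries,
                             src_lang="the source language", tgt_lang="English"):
    """Assemble one prompt string from a grammar-book excerpt and
    bilingual dictionary entries (hypothetical sketch, not the paper's exact prompt)."""
    dictionary_block = "\n".join(
        f"{word} = {gloss}" for word, gloss in dictionary_entries.items()
    )
    return (
        f"You are translating from {src_lang} to {tgt_lang}.\n\n"
        f"Grammar reference:\n{grammar_excerpt}\n\n"
        f"Bilingual dictionary:\n{dictionary_block}\n\n"
        f"Translate this sentence:\n{source_sentence}\n"
    )

# Toy example data; a real use would insert book-length grammar material.
prompt = build_translation_prompt(
    "mi moku",
    "Word order is subject-verb.",
    {"mi": "I", "moku": "eat"},
)
# The assembled prompt would then be sent to a long-context model such as GPT-4.
```

The design point is simply that all reference material is concatenated into a single prompt, relying on the model's long context window rather than on any fine-tuning.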