TreeSwap: Data Augmentation for Machine Translation via Dependency Subtree Swapping

Attila Nagy, Dorina Lakatos, Botond Barta, Judit Ács


Abstract
Data augmentation methods for neural machine translation are particularly useful when only a limited amount of training data is available, which is often the case for low-resource languages. We introduce a novel augmentation method that generates new sentences by swapping objects and subjects across bisentences. This is performed simultaneously based on the dependency parse trees of the source and target sentences. We name this method TreeSwap. Our results show that TreeSwap achieves consistent improvements over baseline models in 4 language pairs in both directions on resource-constrained datasets. We also explore domain-specific corpora, but find that our method does not yield significant improvements on law, medical and IT data. We report the scores of similar augmentation methods and find that TreeSwap performs comparably. We also analyze the generated sentences qualitatively and find that the augmentation produces a correct translation in most cases. Our code is available on GitHub.
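The subtree-swapping idea described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the `Token` structure and the `obj` dependency label are assumptions for the sketch, real parses would come from a dependency parser, and TreeSwap applies the swap to source and target sentences simultaneously rather than to one side only.

```python
from dataclasses import dataclass

@dataclass
class Token:
    text: str
    head: int  # index of the head token (points to itself for the root)
    dep: str   # dependency label, e.g. "nsubj", "obj" (illustrative labels)

def subtree_indices(tokens, root_idx):
    """Return sorted indices of root_idx and all its dependents (transitively)."""
    idxs = {root_idx}
    changed = True
    while changed:
        changed = False
        for i, tok in enumerate(tokens):
            if tok.head in idxs and i not in idxs and tok.head != i:
                idxs.add(i)
                changed = True
    return sorted(idxs)

def swap_subtrees(sent_a, sent_b, dep_label="obj"):
    """Swap the subtrees headed by the first token with the given dependency
    label between two parsed sentences; assumes contiguous (projective) spans.
    Returns the two augmented sentences as word lists, or None if either
    sentence lacks the relation."""
    def find_span(tokens):
        for i, tok in enumerate(tokens):
            if tok.dep == dep_label:
                return subtree_indices(tokens, i)
        return None

    span_a, span_b = find_span(sent_a), find_span(sent_b)
    if span_a is None or span_b is None:
        return None
    words_a = [t.text for t in sent_a]
    words_b = [t.text for t in sent_b]
    sub_a = [words_a[i] for i in span_a]
    sub_b = [words_b[i] for i in span_b]
    new_a = words_a[:span_a[0]] + sub_b + words_a[span_a[-1] + 1:]
    new_b = words_b[:span_b[0]] + sub_a + words_b[span_b[-1] + 1:]
    return new_a, new_b
```

For example, swapping the object subtrees of "The dog chased the cat" and "A boy read a book" yields "The dog chased a book" and "A boy read the cat" — two new, grammatical sentences built from existing material, which is the augmentation effect the method relies on.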
Anthology ID:
2023.ranlp-1.82
Volume:
Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
Month:
September
Year:
2023
Address:
Varna, Bulgaria
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
759–768
URL:
https://aclanthology.org/2023.ranlp-1.82
Cite (ACL):
Attila Nagy, Dorina Lakatos, Botond Barta, and Judit Ács. 2023. TreeSwap: Data Augmentation for Machine Translation via Dependency Subtree Swapping. In Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing, pages 759–768, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
TreeSwap: Data Augmentation for Machine Translation via Dependency Subtree Swapping (Nagy et al., RANLP 2023)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.ranlp-1.82.pdf