BIT-ACT: An Ancient Chinese Translation System Using Data Augmentation
Li Zeng | Yanzhi Tian | Yingyu Shan | Yuhang Guo
Proceedings of ALT2023: Ancient Language Translation Workshop
This paper describes our translation systems for ancient Chinese to modern Chinese and English, built for the EvaHan 2023 competition, a subtask of the Ancient Language Translation 2023 challenge. During training we applied several data augmentation techniques and incorporated SiKu-RoBERTa into our model architecture. The results indicate that back translation improves the model's performance, whereas double back translation introduces noise and degrades it. Fine-tuning on the original dataset helps mitigate this issue.
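The back-translation augmentation mentioned in the abstract can be illustrated with a minimal sketch: monolingual target-side sentences are translated back into the source language by a reverse-direction model, and the resulting synthetic (source, target) pairs are added to the training data. The `reverse_translate` function below is a hypothetical stub standing in for a trained modern-to-ancient model; it is not the paper's actual pipeline.

```python
def reverse_translate(modern_sentence: str) -> str:
    """Stub for a target-to-source model (modern -> ancient Chinese).

    In a real system this would be a trained NMT model running in the
    reverse direction; here it only tags the input for illustration.
    """
    return "<synthetic-ancient> " + modern_sentence


def back_translate(monolingual_target: list[str]) -> list[tuple[str, str]]:
    """Build synthetic (source, target) training pairs.

    Each real target sentence is paired with a machine-generated source
    sentence produced by the reverse model.
    """
    return [(reverse_translate(t), t) for t in monolingual_target]


corpus = ["The general crossed the river at dawn."]
pairs = back_translate(corpus)
print(pairs[0])
```

Applying the same procedure a second time (translating the synthetic source back again) would be "double back translation", which the paper reports as harmful because errors from the reverse model compound.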