BJTU-WeChat’s Systems for the WMT22 Chat Translation Task
Yunlong Liang, Fandong Meng, Jinan Xu, Yufeng Chen, Jie Zhou
Abstract
This paper introduces the joint submission of Beijing Jiaotong University and WeChat AI to the WMT'22 chat translation task for English-German. Based on the Transformer, we apply several effective variants. In our experiments, we follow the pre-training-then-fine-tuning paradigm. In the first, pre-training stage, we employ data filtering and synthetic data generation (i.e., back-translation, forward-translation, and knowledge distillation). In the second, fine-tuning stage, we investigate speaker-aware in-domain data generation, speaker adaptation, prompt-based context modeling, target denoising fine-tuning, and a boosted self-COMET-based model ensemble. Our systems achieve COMET scores of 81.0 on English-German and 94.6 on German-English, the highest among all submissions in both directions.
- Anthology ID: 2022.wmt-1.91
- Volume: Proceedings of the Seventh Conference on Machine Translation (WMT)
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates (Hybrid)
- Venue: WMT
- Publisher: Association for Computational Linguistics
- Pages: 955–961
- URL: https://aclanthology.org/2022.wmt-1.91
- Cite (ACL): Yunlong Liang, Fandong Meng, Jinan Xu, Yufeng Chen, and Jie Zhou. 2022. BJTU-WeChat’s Systems for the WMT22 Chat Translation Task. In Proceedings of the Seventh Conference on Machine Translation (WMT), pages 955–961, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
- Cite (Informal): BJTU-WeChat’s Systems for the WMT22 Chat Translation Task (Liang et al., WMT 2022)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2022.wmt-1.91.pdf
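
The abstract's "boosted self-COMET-based model ensemble" suggests reranking candidate translations by COMET score. The sketch below shows one plausible building block of such an ensemble, assuming the public Unbabel COMET library and the `Unbabel/wmt22-comet-da` checkpoint, and using each candidate's fellow candidates as pseudo-references (one reading of "self-COMET"); the paper's exact "boosted" recipe is not reproduced here, so treat this as an illustration rather than the authors' method.

```python
# A minimal sketch of COMET-based candidate selection for model ensembling.
# Assumptions (not from the paper): the checkpoint name, the pseudo-reference
# scheme, and simple mean aggregation over pseudo-references.
# Requires: pip install unbabel-comet
from comet import download_model, load_from_checkpoint


def select_by_self_comet(source: str, candidates: list[str]) -> str:
    """Score each candidate against the other candidates as pseudo-references
    and return the candidate with the highest average COMET score."""
    if len(candidates) < 2:
        return candidates[0]
    model = load_from_checkpoint(download_model("Unbabel/wmt22-comet-da"))

    # Build one (src, mt, ref) triple per candidate / pseudo-reference pair.
    data, owner = [], []
    for i, mt in enumerate(candidates):
        for j, ref in enumerate(candidates):
            if i != j:
                data.append({"src": source, "mt": mt, "ref": ref})
                owner.append(i)

    # COMET returns one segment-level score per triple.
    scores = model.predict(data, batch_size=8, gpus=0).scores

    # Average each candidate's scores over all of its pseudo-references.
    totals = [0.0] * len(candidates)
    counts = [0] * len(candidates)
    for score, i in zip(scores, owner):
        totals[i] += score
        counts[i] += 1
    best = max(range(len(candidates)), key=lambda i: totals[i] / counts[i])
    return candidates[best]


# Usage: pick among hypotheses produced by differently fine-tuned models.
# print(select_by_self_comet("Guten Morgen!", ["Good morning!", "Morning!"]))
```

In a full ensemble, each member model would contribute one or more hypotheses per source sentence, and the selection above would run per sentence; a "boosted" variant would presumably reweight members by their past selection success, but that detail is speculative here.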