USST’s System for AutoSimTrans 2022

Zhu Hui, Yu Jun


Abstract
This paper describes our submitted text-to-text simultaneous translation (ST) system, which won second place in the Chinese→English streaming translation task of AutoSimTrans 2022. Our baseline system is a BPE-based Transformer model trained with the PaddlePaddle framework. In our experiments, we employ data synthesis and ensemble approaches to enhance the base model. To bridge the gap between the general domain and the spoken domain, we select in-domain data from the general corpus and mix it with the spoken corpus for mixed fine-tuning. Finally, we adopt a fixed wait-k policy to transfer our full-sentence translation model to a simultaneous translation model. Experiments on the development data show that our system outperforms the baseline system.
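The fixed wait-k policy in the abstract can be sketched as a read/write schedule: the decoder first reads k source tokens, then alternates between writing one target token and reading one more source token, falling back to write-only once the source is exhausted. The helper below is an illustrative sketch of that schedule only, not the authors' implementation; the function name and signature are assumptions.

```python
def wait_k_schedule(src_len, tgt_len, k):
    """Return the READ/WRITE action sequence of a fixed wait-k policy.

    The policy READs the first k source tokens, then alternates
    WRITE/READ (lagging k tokens behind the source) until the source
    is exhausted, after which it WRITEs the remaining target tokens.
    """
    actions = []
    read, written = 0, 0
    while written < tgt_len:
        if read < min(written + k, src_len):
            actions.append("READ")
            read += 1
        else:
            actions.append("WRITE")
            written += 1
    return actions
```

Note that for k at least as large as the source length, the schedule degenerates to reading the whole source before writing anything, i.e. ordinary full-sentence translation.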
Anthology ID:
2022.autosimtrans-1.7
Volume:
Proceedings of the Third Workshop on Automatic Simultaneous Translation
Month:
July
Year:
2022
Address:
Online
Editors:
Julia Ive, Ruiqing Zhang
Venue:
AutoSimTrans
Publisher:
Association for Computational Linguistics
Pages:
43–49
URL:
https://aclanthology.org/2022.autosimtrans-1.7
DOI:
10.18653/v1/2022.autosimtrans-1.7
Bibkey:
Cite (ACL):
Zhu Hui and Yu Jun. 2022. USST’s System for AutoSimTrans 2022. In Proceedings of the Third Workshop on Automatic Simultaneous Translation, pages 43–49, Online. Association for Computational Linguistics.
Cite (Informal):
USST’s System for AutoSimTrans 2022 (Hui & Jun, AutoSimTrans 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.autosimtrans-1.7.pdf
Data
BSTC