2021
The Mininglamp Machine Translation System for WMT21
Shiyu Zhao | Xiaopu Li | Minghui Wu | Jie Hao
Proceedings of the Sixth Conference on Machine Translation
This paper describes the Mininglamp neural machine translation systems for the WMT2021 news translation tasks. We participated in eight translation directions for news text: Chinese to/from English, Hausa to/from English, German to/from English, and French to/from German. Our base systems were built on the Transformer architecture, with wider or smaller configurations for the different tasks. We mainly used back-translation, knowledge distillation, and fine-tuning to improve single models, and ensembling to combine them. Our final submission ranked first for the English to/from Hausa task.
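The abstract names back-translation as a core augmentation method. As a rough illustration of the idea, the sketch below pairs target-side monolingual sentences with synthetic sources produced by a reverse-direction (target-to-source) model; the resulting pairs are mixed into the training bitext. The `ReverseModel` class and its `translate()` method are hypothetical placeholders, not code from the paper.

```python
# Minimal back-translation sketch. ReverseModel is a hypothetical stand-in
# for a trained target->source NMT model; it is not from the paper.

class ReverseModel:
    """Placeholder for a trained target->source translation model."""

    def translate(self, sentence: str) -> str:
        # Dummy behaviour for illustration only.
        return "<synthetic source for: %s>" % sentence


def back_translate(monolingual_target, reverse_model):
    """Pair each target-side monolingual sentence with a synthetic source."""
    return [(reverse_model.translate(t), t) for t in monolingual_target]


# The synthetic pairs would then be mixed with genuine bitext to train
# the forward (source->target) model.
synthetic_bitext = back_translate(["an example target sentence"], ReverseModel())
```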
2020
OPPO’s Machine Translation System for the IWSLT 2020 Open Domain Translation Task
Qian Zhang | Xiaopu Li | Dawei Dang | Tingxun Shi | Di Ai | Zhengshan Xue | Jie Hao
Proceedings of the 17th International Conference on Spoken Language Translation
In this paper, we present our machine translation system for the Chinese-Japanese bidirectional translation task (the open domain translation task) at IWSLT 2020. Our model is based on the Transformer (Vaswani et al., 2017), aided by several popular data preprocessing and augmentation methods of widely proven effectiveness. Experiments show that these methods improve the baseline model steadily and significantly.
OPPO’s Machine Translation Systems for WMT20
Tingxun Shi | Shiyu Zhao | Xiaopu Li | Xiaoxue Wang | Qian Zhang | Di Ai | Dawei Dang | Xue Zhengshan | Jie Hao
Proceedings of the Fifth Conference on Machine Translation
In this paper we present OPPO's machine translation systems for all 22 language pairs of the WMT20 Shared Task on News Translation. We first give an overview of the aspects common to all systems in two parts: the data preprocessing part describes how the data are preprocessed and filtered, and the system part describes our model architectures and the techniques we applied. Detailed information, such as training hyperparameters and the results produced by each technique, is given in the corresponding subsections. Our final submissions ranked first in 6 directions (English ↔ Czech, English ↔ Russian, French → German, and Tamil → English), third in 2 directions (English → German, English → Japanese), and fourth in 2 directions (English → Pashto and English → Tamil).
2019
OPPO NMT System for IWSLT 2019
Xiaopu Li | Zhengshan Xue | Jie Hao
Proceedings of the 16th International Conference on Spoken Language Translation
This paper describes OPPO's submission to the IWSLT 2019 text translation task. Our system is based on the Transformer architecture. We also study the effect of model ensembling. On the IWSLT 2019 dev sets, our system reaches a BLEU score of 19.94.
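Several of the systems above combine single models by ensembling. One common realization, sketched below under the assumption of probability averaging at each decoding step, combines the models' next-token distributions; this is illustrative only and not necessarily the exact scheme used in the papers.

```python
# Minimal decoding-time ensembling sketch: average the next-token
# probability distributions of several models at one decoding step.
# Illustrative only; not the papers' exact implementation.

import math

def ensemble_step(per_model_log_probs):
    """Given one log-probability vector over the vocabulary per model,
    return the log of the averaged probabilities."""
    n_models = len(per_model_log_probs)
    vocab_size = len(per_model_log_probs[0])
    return [
        math.log(sum(math.exp(lp[v]) for lp in per_model_log_probs) / n_models)
        for v in range(vocab_size)
    ]

# Example: two models, a three-word vocabulary.
model_a = [math.log(0.7), math.log(0.2), math.log(0.1)]
model_b = [math.log(0.5), math.log(0.3), math.log(0.2)]
print(ensemble_step([model_a, model_b]))  # averaged distribution in log space
```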