Nini Xiao
2022
TSMind: Alibaba and Soochow University’s Submission to the WMT22 Translation Suggestion Task
Xin Ge | Ke Wang | Jiayi Wang | Nini Xiao | Xiangyu Duan | Yu Zhao | Yuqi Zhang
Proceedings of the Seventh Conference on Machine Translation (WMT)
This paper describes the joint submission of Alibaba and Soochow University to the WMT 2022 Shared Task on Translation Suggestion (TS). We participate in the English to/from German and English to/from Chinese tasks. We adopt the paradigm of fine-tuning large-scale pre-trained models on downstream tasks, which has recently achieved great success, choosing FAIR's WMT19 English to/from German news translation system and mBART-50 for English to/from Chinese as our pre-trained models. Because the task limits the use of training data, we follow the data augmentation strategies proposed by Yang to boost our TS model's performance, and we further apply a dual conditional cross-entropy model and a GPT-2 language model to filter the augmented data. On the final leaderboard, our submissions rank first in three of the four language directions of the Naive TS track of the WMT22 Translation Suggestion task.
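The two filtering signals named in the abstract are standard: dual conditional cross-entropy scoring (Junczys-Dowmunt, 2018) rates a sentence pair with a forward and a backward translation model, and GPT-2 perplexity flags disfluent target sentences. The sketch below is a minimal illustration of both scores, not the authors' actual pipeline; `dual_xent_score` is a hypothetical helper that takes per-token cross-entropies already computed by the two translation models, and the thresholds are purely illustrative.

```python
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast


def dual_xent_score(h_fwd: float, h_bwd: float) -> float:
    """Dual conditional cross-entropy score for one sentence pair.

    h_fwd: per-token cross-entropy of target given source (forward NMT model).
    h_bwd: per-token cross-entropy of source given target (backward NMT model).
    Lower is better: the two models should agree (small |h_fwd - h_bwd|)
    and both should find the pair likely (small average).
    """
    return abs(h_fwd - h_bwd) + 0.5 * (h_fwd + h_bwd)


tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
lm = GPT2LMHeadModel.from_pretrained("gpt2").eval()


@torch.no_grad()
def gpt2_perplexity(sentence: str) -> float:
    """Perplexity of a sentence under GPT-2; high values suggest disfluency."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    loss = lm(ids, labels=ids).loss  # mean per-token cross-entropy
    return math.exp(loss.item())


# Keep a synthetic pair only if both filters pass (thresholds are illustrative).
if dual_xent_score(2.1, 2.4) < 4.0 and gpt2_perplexity("A fluent sentence.") < 200.0:
    print("keep")
```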
2021
Inter-layer Knowledge Distillation for Neural Machine Translation (基于层间知识蒸馏的神经机器翻译)
Chang Jin (金畅) | Renchong Duan (段仁翀) | Nini Xiao (肖妮妮) | Xiangyu Duan (段湘煜)
Proceedings of the 20th Chinese National Conference on Computational Linguistics
Neural machine translation (NMT) typically uses a multi-layer network architecture; as the network deepens, the features it extracts become increasingly abstract, yet in existing NMT models the abstract information in the high layers is used only when predicting the output distribution. To exploit this information better, this paper proposes inter-layer knowledge distillation, which transfers the abstract knowledge of high layers to low layers so that the low layers can capture more useful information, improving the translation quality of the whole model. Unlike traditional knowledge distillation between a teacher model and a student model, inter-layer knowledge distillation transfers knowledge between different layers inside a single model. Experiments on Chinese-English, English-Romanian, and German-English datasets show that inter-layer distillation effectively improves translation performance, yielding BLEU gains of 1.19, 0.72, and 1.35 on Chinese-English, English-Romanian, and German-English respectively, and confirm that making effective use of high-layer information improves the translation quality of neural models.
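As a rough illustration of the idea (an assumed form, not the paper's exact objective), one simple inter-layer distillation term regresses each lower layer's hidden states onto a detached copy of the top layer's, so gradients pull the low layers toward the high-layer abstraction without letting the teacher drift toward the students; the term is added to the usual NMT cross-entropy with a weight λ. The function name and the MSE choice are assumptions for this sketch.

```python
import torch
import torch.nn.functional as F


def inter_layer_kd_loss(hidden_states: list[torch.Tensor]) -> torch.Tensor:
    """Illustrative inter-layer distillation term (assumed form, not the
    paper's exact loss): every lower layer is pulled toward the detached
    top-layer representation, transferring knowledge inside one model.

    hidden_states: per-layer outputs of a Transformer stack,
                   each of shape [batch, seq_len, d_model].
    """
    teacher = hidden_states[-1].detach()  # top layer acts as the teacher
    losses = [F.mse_loss(h, teacher) for h in hidden_states[:-1]]
    return torch.stack(losses).mean()


# Training objective: standard NMT loss plus the weighted distillation term.
# nmt_loss and hidden_states would come from the translation model's forward pass.
hidden_states = [torch.randn(2, 5, 8) for _ in range(6)]  # toy 6-layer stack
nmt_loss = torch.tensor(3.0)
total_loss = nmt_loss + 0.5 * inter_layer_kd_loss(hidden_states)
print(total_loss.item())
```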
Combining Static Word Embeddings and Contextual Representations for Bilingual Lexicon Induction
Jinpeng Zhang | Baijun Ji | Nini Xiao | Xiangyu Duan | Min Zhang | Yangbin Shi | Weihua Luo
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021