Duanzhu Sangjie

Also published as: 端珠 桑杰




2022

基于词典注入的藏汉机器翻译模型预训练方法(Dictionary Injection Based Pretraining Method for Tibetan-Chinese Machine Translation Model)
Duanzhu Sangjie (桑杰端珠) | Jia Cairang (才让加)
Proceedings of the 21st Chinese National Conference on Computational Linguistics

“In recent years, pretraining methods have attracted wide attention in natural language processing, but in low-resource settings such as Tibetan-Chinese machine translation, bilingual supervision cannot participate directly in pretraining, which limits the gains pretrained models achieve on such tasks. Considering that bilingual dictionaries are a rich and inexpensive source of prior translation knowledge, and inspired by the observation that in cross-lingual communication people often mix languages to communicate more efficiently, this paper proposes a dictionary-injection-based pretraining method for Tibetan-Chinese machine translation models, giving pretraining broad opportunities to learn associations between bilingual knowledge. Experiments show that the method outperforms a strong BART baseline by 2.3 and 2.1 BLEU on the Tibetan-to-Chinese and Chinese-to-Tibetan test sets respectively, confirming its effectiveness on the Tibetan-Chinese machine translation task.”
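The core idea of dictionary injection, as the abstract describes it, is to use a bilingual dictionary to replace some source-language words with their translations, yielding mixed-language sequences from which a pretrained model can learn bilingual associations. The following is only a minimal illustrative sketch of that general idea, not the authors' implementation; the function name, the replacement rate, and the toy Chinese-English dictionary (standing in for a Tibetan-Chinese one) are all assumptions.

```python
import random

def inject_dictionary(tokens, bilingual_dict, rate=0.3, seed=0):
    """Randomly replace tokens that appear in a bilingual dictionary with
    one of their translations, producing a code-switched sequence that
    could serve as dictionary-injected pretraining data.

    Toy sketch only -- the paper's actual injection strategy may differ.
    """
    rng = random.Random(seed)
    mixed = []
    for tok in tokens:
        translations = bilingual_dict.get(tok)
        if translations and rng.random() < rate:
            # Inject a dictionary translation in place of the source token.
            mixed.append(rng.choice(translations))
        else:
            mixed.append(tok)
    return mixed

# Hypothetical toy dictionary; rate=1.0 replaces every covered token.
toy_dict = {"你好": ["hello"], "世界": ["world"]}
print(inject_dictionary(["你好", "，", "世界"], toy_dict, rate=1.0))
```

Sequences produced this way would then be fed to the usual denoising pretraining objective (e.g. a BART-style one), letting the model see source words and their translations in shared contexts.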