Mao Hongbao




2022

Transn’s Submissions to the WMT22 Translation Suggestion Task
Mao Hongbao | Zhang Wenbo | Cai Jie | Cheng Jianwei
Proceedings of the Seventh Conference on Machine Translation (WMT)

This paper describes Transn’s submissions to the WMT2022 shared task on Translation Suggestion. Our team participated in two tasks, Naive Translation Suggestion and Translation Suggestion with Hints, focusing on two language directions: Zh→En and En→Zh. Apart from the golden training data provided by the shared task, we utilized a synthetic corpus to fine-tune DeltaLM (∆LM), a pre-trained encoder-decoder language model. We applied a two-stage training strategy to ∆LM and several effective methods for generating the synthetic corpus, which contributed substantially to the results. According to the official evaluation results in terms of BLEU scores, our submissions to Naive Translation Suggestion En→Zh and Translation Suggestion with Hints (both Zh→En and En→Zh) ranked 1st, and our Naive Translation Suggestion Zh→En submission also achieved a result comparable to the best score.
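The abstract describes a two-stage fine-tuning recipe: first adapting the pre-trained encoder-decoder model on a large synthetic corpus, then fine-tuning on the golden task data. The sketch below illustrates that general idea only; it is not the authors' released code. The checkpoint name, dataset placeholders, and hyperparameters are assumptions (DeltaLM itself is distributed via Microsoft's unilm repository rather than as a stock Hugging Face checkpoint), and any seq2seq encoder-decoder model could stand in for ∆LM here.

```python
# Minimal sketch of two-stage fine-tuning of an encoder-decoder model,
# as described in the abstract. NOT the authors' implementation; the
# checkpoint, data, and hyperparameters below are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM


def fine_tune(model, tokenizer, pairs, epochs, lr):
    """Fine-tune a seq2seq model on (source, target) sentence pairs."""
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for src, tgt in pairs:
            batch = tokenizer(src, text_target=tgt, return_tensors="pt",
                              truncation=True, padding=True)
            loss = model(**batch).loss  # cross-entropy on target tokens
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()
    return model


# Placeholder checkpoint standing in for DeltaLM (assumption).
checkpoint = "facebook/mbart-large-50"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

synthetic_pairs = [("synthetic source", "synthetic suggestion")]  # stage 1 data
golden_pairs = [("task source", "golden suggestion")]             # stage 2 data

# Stage 1: adapt on the large synthetic corpus.
model = fine_tune(model, tokenizer, synthetic_pairs, epochs=1, lr=1e-4)
# Stage 2: fine-tune on the golden data from the shared task.
model = fine_tune(model, tokenizer, golden_pairs, epochs=3, lr=1e-5)
```

The split into two calls mirrors the two-stage strategy: a higher learning rate and a single pass over the noisy synthetic data, followed by a lower learning rate on the smaller, cleaner golden set.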