IIGROUP Submissions for WMT22 Word-Level AutoCompletion Task

Cheng Yang, Siheng Li, Chufan Shi, Yujiu Yang


Abstract
This paper presents IIGroup's submission to the WMT22 Word-Level AutoCompletion (WLAC) Shared Task in four language directions. We propose a Generate-then-Rerank framework to solve this task. More specifically, the generator produces candidate words, recalling as many positive candidates as possible. To facilitate training the generator, we propose a span-level mask prediction task. Once we obtain the candidate words, we take the top-K candidates and feed them into the reranker, which selects the most confident candidate. Experimental results in four language directions demonstrate the effectiveness of our systems, which achieve competitive performance, ranking 1st in the English-to-Chinese subtask and 2nd in the Chinese-to-English subtask.
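The two-stage framework described in the abstract can be sketched as follows. This is a minimal illustrative pipeline, not the authors' implementation: the scoring heuristics, function names, and toy vocabulary are all hypothetical stand-ins for the paper's trained generator and reranker models.

```python
# Hypothetical sketch of a generate-then-rerank pipeline: a recall-oriented
# generator proposes candidate completions, the top-K are handed to a
# reranker, and the reranker picks the single most confident word.

def generate_candidates(context, typed_prefix, vocab, k=5):
    """Generator stage: score every vocabulary word matching the typed
    prefix and keep the top-k (favoring recall over precision).
    Toy score: how often the word appears as a substring of context words."""
    matches = [w for w in vocab if w.startswith(typed_prefix)]
    scored = sorted(
        matches,
        key=lambda w: sum(w in c for c in context),
        reverse=True,
    )
    return scored[:k]

def rerank(context, candidates):
    """Reranker stage: select the most confident candidate.
    Toy confidence: exact-match count in the context, length as tiebreak."""
    return max(candidates, key=lambda w: (sum(w == c for c in context), len(w)))

vocab = ["translate", "translation", "transfer", "training", "token"]
context = ["machine", "translation", "systems", "need", "translation", "data"]
candidates = generate_candidates(context, "tran", vocab, k=3)
best = rerank(context, candidates)  # "translation"
```

In the paper, both stages are learned models (the generator trained with span-level mask prediction); the toy scores above merely stand in for model probabilities to show the control flow.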
Anthology ID:
2022.wmt-1.121
Volume:
Proceedings of the Seventh Conference on Machine Translation (WMT)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Philipp Koehn, Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri, Aurélie Névéol, Mariana Neves, Martin Popel, Marco Turchi, Marcos Zampieri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
1187–1191
URL:
https://aclanthology.org/2022.wmt-1.121
Cite (ACL):
Cheng Yang, Siheng Li, Chufan Shi, and Yujiu Yang. 2022. IIGROUP Submissions for WMT22 Word-Level AutoCompletion Task. In Proceedings of the Seventh Conference on Machine Translation (WMT), pages 1187–1191, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
IIGROUP Submissions for WMT22 Word-Level AutoCompletion Task (Yang et al., WMT 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/2022.wmt-1.121.pdf