CUED@WMT19:EWC&LMs

Felix Stahlberg, Danielle Saunders, Adrià de Gispert, Bill Byrne


Abstract
Two techniques provide the fabric of the Cambridge University Engineering Department’s (CUED) entry to the WMT19 evaluation campaign: elastic weight consolidation (EWC) and different forms of language modelling (LMs). We report substantial gains by fine-tuning very strong baselines on former WMT test sets using a combination of checkpoint averaging and EWC. A sentence-level Transformer LM and a document-level LM based on a modified Transformer architecture yield further gains. As in previous years, we also extract n-gram probabilities from SMT lattices which can be seen as a source-conditioned n-gram LM.
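To make the EWC idea in the abstract concrete, here is a minimal, hypothetical PyTorch-style sketch of an EWC-regularized fine-tuning loss; it is not the authors' implementation, and the names (theta_star, fisher, lam) are illustrative. It assumes a diagonal Fisher estimate and a copy of the baseline parameters are computed before fine-tuning on the former WMT test sets.

import torch

def ewc_penalty(model, theta_star, fisher, lam=1.0):
    # Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.
    # theta_star and fisher are dicts of tensors keyed by parameter name,
    # taken from the strong baseline model before fine-tuning (assumed names).
    penalty = torch.zeros(())
    for name, param in model.named_parameters():
        penalty = penalty + (fisher[name] * (param - theta_star[name]) ** 2).sum()
    return 0.5 * lam * penalty

# During fine-tuning, the penalty is added to the usual cross-entropy loss so
# the adapted model stays close to the baseline it started from:
#   loss = cross_entropy(logits, targets) + ewc_penalty(model, theta_star, fisher)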
Anthology ID: W19-5340
Volume: Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)
Month: August
Year: 2019
Address: Florence, Italy
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 364–373
URL: https://aclanthology.org/W19-5340
DOI: 10.18653/v1/W19-5340
Cite (ACL): Felix Stahlberg, Danielle Saunders, Adrià de Gispert, and Bill Byrne. 2019. CUED@WMT19:EWC&LMs. In Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1), pages 364–373, Florence, Italy. Association for Computational Linguistics.
Cite (Informal): CUED@WMT19:EWC&LMs (Stahlberg et al., WMT 2019)
PDF: https://preview.aclanthology.org/ingestion-script-update/W19-5340.pdf