Context and Copying in Neural Machine Translation

Rebecca Knowles, Philipp Koehn



Abstract
Neural machine translation systems with subword vocabularies are capable of translating or copying unknown words. In this work, we show that they learn to copy words based on both the context in which the words appear and features of the words themselves. In contexts that are particularly copy-prone, they even copy words that they have already learned they should translate. We examine the influence of context and subword features on this and other types of copying behavior.
Anthology ID: D18-1339
Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month: October-November
Year: 2018
Address: Brussels, Belgium
Editors: Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 3034–3041
URL: https://aclanthology.org/D18-1339
DOI: 10.18653/v1/D18-1339
Cite (ACL): Rebecca Knowles and Philipp Koehn. 2018. Context and Copying in Neural Machine Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3034–3041, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal): Context and Copying in Neural Machine Translation (Knowles & Koehn, EMNLP 2018)
PDF: https://preview.aclanthology.org/teach-a-man-to-fish/D18-1339.pdf