Guided Alignment Training for Topic-Aware Neural Machine Translation

Wenhu Chen, Evgeny Matusov, Shahram Khadivi, Jan-Thorsten Peter

Abstract
In this paper, we propose an effective way to bias the attention mechanism of a sequence-to-sequence neural machine translation (NMT) model towards the well-studied statistical word alignment models. We show that our novel guided alignment training approach improves translation quality on real-life e-commerce texts consisting of product titles and descriptions, overcoming the problems posed by many unknown words and a large type/token ratio. We also show that meta-data associated with input texts, such as topic or category information, can significantly improve translation quality when used as an additional signal to the decoder part of the network. With both novel features, the BLEU score of the NMT system on a product title set improves from 18.6% to 21.3%. Even larger MT quality gains are obtained through domain adaptation of a general-domain NMT system to e-commerce data. The developed NMT system also performs well on the IWSLT speech translation task, where an ensemble of four variant systems outperforms the phrase-based baseline by 2.1% BLEU absolute.
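The abstract describes two mechanisms: a guided alignment loss that biases the attention weights towards statistical word alignments, and topic or category meta-data fed to the decoder as an additional signal. Below is a minimal sketch of both ideas, assuming PyTorch; all function names and the interpolation weight w_align are illustrative assumptions, not taken from the paper or the linked repository.

import torch

def guided_alignment_loss(attn_weights, ref_alignment, eps=1e-8):
    """Cross-entropy between the decoder's attention weights and a
    reference word-alignment matrix from a statistical aligner,
    row-normalized over source positions (a sketch, not the paper's code).

    attn_weights:  (batch, tgt_len, src_len) attention probabilities
    ref_alignment: (batch, tgt_len, src_len) rows summing to 1
    """
    return -(ref_alignment * torch.log(attn_weights + eps)).sum(dim=-1).mean()

def total_loss(nmt_loss, attn_weights, ref_alignment, w_align=0.5):
    # Combined objective: standard NMT cross-entropy plus the weighted
    # alignment term; w_align is a hypothetical interpolation weight.
    return nmt_loss + w_align * guided_alignment_loss(attn_weights, ref_alignment)

def decoder_input_with_topic(word_emb, topic_id, topic_embedding):
    # Topic-aware decoding (sketch): embed the input text's topic or
    # category meta-data and concatenate it to every decoder input
    # embedding, giving the decoder the additional signal described
    # in the abstract. topic_embedding is an nn.Embedding instance.
    topic_vec = topic_embedding(topic_id)                    # (batch, topic_dim)
    topic_vec = topic_vec.unsqueeze(1).expand(-1, word_emb.size(1), -1)
    return torch.cat([word_emb, topic_vec], dim=-1)          # (batch, tgt_len, emb+topic)

During training, the alignment term pulls the attention distribution for each target word towards the statistical alignment, while the topic vector conditions every decoding step on the document's category.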
Anthology ID:
2016.amta-researchers.10
Volume:
Conferences of the Association for Machine Translation in the Americas: MT Researchers' Track
Month:
October 28 - November 1
Year:
2016
Address:
Austin, TX, USA
Venue:
AMTA
Publisher:
The Association for Machine Translation in the Americas
Pages:
121–134
URL:
https://aclanthology.org/2016.amta-researchers.10
Cite (ACL):
Wenhu Chen, Evgeny Matusov, Shahram Khadivi, and Jan-Thorsten Peter. 2016. Guided Alignment Training for Topic-Aware Neural Machine Translation. In Conferences of the Association for Machine Translation in the Americas: MT Researchers' Track, pages 121–134, Austin, TX, USA. The Association for Machine Translation in the Americas.
Cite (Informal):
Guided Alignment Training for Topic-Aware Neural Machine Translation (Chen et al., AMTA 2016)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2016.amta-researchers.10.pdf
Code
wenhuchen/iwslt-2015-de-en-topics