Beyond BLEU: Training Neural Machine Translation with Semantic Similarity

John Wieting, Taylor Berg-Kirkpatrick, Kevin Gimpel, Graham Neubig


Abstract
While most neural machine translation (NMT) systems are still trained using maximum likelihood estimation, recent work has demonstrated that optimizing systems to directly improve evaluation metrics such as BLEU can significantly improve final translation accuracy. However, training with BLEU has some limitations: it doesn't assign partial credit, it has a limited range of output values, and it can penalize semantically correct hypotheses if they differ lexically from the reference. In this paper, we introduce an alternative reward function for optimizing NMT systems that is based on recent work in semantic similarity. We evaluate on four disparate languages translated to English, and find that training with our proposed metric results in better translations as evaluated by BLEU, semantic similarity, and human evaluation, and also that the optimization procedure converges faster. Analysis suggests that this is because the proposed metric is more conducive to optimization, assigning partial credit and providing more diversity in scores than BLEU.
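The idea the abstract describes can be illustrated with a toy sketch: replace BLEU with a continuous semantic-similarity reward inside an expected-risk (minimum risk training) objective. Everything below is a simplified stand-in, not the paper's actual method: the tiny embedding table, the averaged-word-vector "sentence encoder" (the paper uses a trained subword-embedding similarity model), and the `expected_risk` form over a sampled hypothesis set are all assumptions for illustration.

```python
import math

# Hypothetical toy word-embedding table (assumption: any sentence
# encoder producing vectors would fill this role in practice).
EMB = {
    "the": [1.0, 0.0], "cat": [0.0, 1.0], "a": [0.9, 0.1],
    "feline": [0.1, 0.9], "dog": [0.5, 0.5],
}

def embed(sent):
    # Average word vectors: a crude stand-in for a trained sentence encoder.
    vecs = [EMB.get(w, [0.0, 0.0]) for w in sent.split()]
    n = max(len(vecs), 1)
    return [sum(v[i] for v in vecs) / n for i in range(2)]

def cosine(u, v):
    # Cosine similarity: a continuous reward in [-1, 1] that gives
    # partial credit to lexically different but semantically close outputs.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(a * a for a in v)) or 1.0
    return dot / (nu * nv)

def expected_risk(hyps, probs, ref):
    # Minimum-risk-style objective: expected cost (1 - reward) over a
    # sampled hypothesis set, with probs renormalized over that set.
    rewards = [cosine(embed(h), embed(ref)) for h in hyps]
    z = sum(probs)
    return sum(p / z * (1.0 - r) for p, r in zip(probs, rewards))
```

Note how "a feline" earns nearly full reward against the reference "the cat" despite sharing no words, whereas an n-gram overlap metric like BLEU would score it at zero; this partial credit is the property the paper argues makes the reward easier to optimize.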
Anthology ID:
P19-1427
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4344–4355
URL:
https://aclanthology.org/P19-1427
DOI:
10.18653/v1/P19-1427
Cite (ACL):
John Wieting, Taylor Berg-Kirkpatrick, Kevin Gimpel, and Graham Neubig. 2019. Beyond BLEU: Training Neural Machine Translation with Semantic Similarity. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 4344–4355, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Beyond BLEU: Training Neural Machine Translation with Semantic Similarity (Wieting et al., ACL 2019)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/P19-1427.pdf
Supplementary:
P19-1427.Supplementary.pdf
Video:
https://preview.aclanthology.org/ml4al-ingestion/P19-1427.mp4
Data:
WMT 2018