Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation

Xing Niu, Weijia Xu, Marine Carpuat


Abstract
We aim to better exploit the limited amounts of parallel text available in low-resource settings by introducing a differentiable reconstruction loss for neural machine translation (NMT). This loss compares original inputs to reconstructed inputs, obtained by back-translating translation hypotheses into the input language. We leverage differentiable sampling and bi-directional NMT to train models end-to-end, without introducing additional parameters. This approach achieves small but consistent BLEU improvements on four language pairs in both translation directions, and outperforms an alternative differentiable reconstruction strategy based on hidden states.
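The abstract only describes the approach at a high level. As a rough illustration, the sketch below shows one way a differentiable input-reconstruction loss can be wired up with straight-through Gumbel-softmax sampling in PyTorch. This is not the authors' Sockeye/MXNet implementation; the model interface (bwd_model, src_embedding, decode_from_embeddings) and the tensor shapes are assumptions made for the sketch.

import torch.nn.functional as F

def reconstruction_loss(fwd_logits, bwd_model, src_tokens, tau=1.0):
    # fwd_logits: (batch, tgt_len, vocab) decoder logits from the forward direction.
    # bwd_model:  handle for the reverse direction (hypothetical interface).
    # src_tokens: (batch, src_len) original input tokens to be reconstructed.

    # Differentiable sampling of a translation hypothesis: hard one-hot vectors
    # in the forward pass, soft gradients in the backward pass (straight-through).
    y_soft = F.gumbel_softmax(fwd_logits, tau=tau, hard=True)   # (batch, tgt_len, vocab)

    # Embed the sampled hypothesis without an argmax so gradients keep flowing.
    # Assumes the reverse direction exposes a source-side embedding matrix.
    y_emb = y_soft @ bwd_model.src_embedding.weight             # (batch, tgt_len, emb)

    # Back-translate the hypothesis and score the original input under the
    # reverse direction; the loss is ordinary token-level cross-entropy.
    recon_logits = bwd_model.decode_from_embeddings(y_emb, src_tokens)  # (batch, src_len, vocab)
    return F.cross_entropy(recon_logits.transpose(1, 2), src_tokens)

In the paper both translation directions are handled by a single bi-directional model, so no parameters are added; in this sketch the reverse direction is shown as a separate handle purely for clarity, and the reconstruction terms would be summed with the standard translation losses of both directions.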
Anthology ID:
N19-1043
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
442–448
URL:
https://aclanthology.org/N19-1043
DOI:
10.18653/v1/N19-1043
Cite (ACL):
Xing Niu, Weijia Xu, and Marine Carpuat. 2019. Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 442–448, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation (Niu et al., NAACL 2019)
PDF:
https://preview.aclanthology.org/update-css-js/N19-1043.pdf
Code:
xingniu/sockeye