Iterative Refinement in the Continuous Space for Non-Autoregressive Neural Machine Translation

Jason Lee, Raphael Shu, Kyunghyun Cho


Abstract
We propose an efficient inference procedure for non-autoregressive machine translation that iteratively refines translations purely in the continuous space. Given a continuous latent variable model for machine translation (Shu et al., 2020), we train an inference network to approximate the gradient of the marginal log probability of the target sentence, using only the latent variable. This allows us to use gradient-based optimization to find, at inference time, the target sentence that approximately maximizes its marginal probability. As each refinement step involves computation only in the latent space of low dimensionality (we use 8 in our experiments), we avoid the computational overhead incurred by existing non-autoregressive inference procedures that often refine in token space. We compare our approach to a recently proposed EM-like inference procedure (Shu et al., 2020) that optimizes in a hybrid space, consisting of both discrete and continuous variables. We evaluate our approach on WMT’14 En→De, WMT’16 Ro→En and IWSLT’16 De→En, and observe two advantages over the EM-like inference: (1) it is computationally efficient, i.e. each refinement step is twice as fast, and (2) it is more effective, resulting in higher marginal probabilities and BLEU scores with the same number of refinement steps. On WMT’14 En→De, for instance, our approach is able to decode 6.2 times faster than the autoregressive model with minimal degradation to translation quality (0.9 BLEU).
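The refinement loop described in the abstract amounts to plain gradient ascent in the latent space: an inference network supplies an approximate gradient of the marginal log probability, and the latent is updated repeatedly before a single final decoding into tokens. The sketch below illustrates only that optimization pattern; `approx_grad` here is a toy stand-in (the gradient of a quadratic log-density with a known optimum), and the step size, step count, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

LATENT_DIM = 8  # the paper uses an 8-dimensional latent space


def approx_grad(z, z_star):
    # Stand-in for the trained inference network that approximates
    # grad_z log p(y | x). Here: the gradient of a toy quadratic
    # log-density centred at z_star (hypothetical, for illustration).
    return -(z - z_star)


def refine(z0, z_star, steps=20, lr=0.5):
    # Iteratively refine the latent purely in continuous space:
    # z <- z + lr * (approximate gradient). No token-space decoding
    # happens inside the loop; tokens would be decoded once at the end.
    z = z0
    for _ in range(steps):
        z = z + lr * approx_grad(z, z_star)
    return z


rng = np.random.default_rng(0)
z_star = rng.normal(size=LATENT_DIM)  # latent near the "best" translation
z0 = rng.normal(size=LATENT_DIM)      # initial latent guess
z = refine(z0, z_star)
print(np.linalg.norm(z - z_star) < 1e-3)  # refinement converged
```

Because each step touches only an 8-dimensional vector (per position) rather than a full softmax over the vocabulary, the per-step cost is small, which is the source of the speedup the abstract reports over token-space refinement.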
Anthology ID:
2020.emnlp-main.73
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1006–1015
URL:
https://aclanthology.org/2020.emnlp-main.73
DOI:
10.18653/v1/2020.emnlp-main.73
Cite (ACL):
Jason Lee, Raphael Shu, and Kyunghyun Cho. 2020. Iterative Refinement in the Continuous Space for Non-Autoregressive Neural Machine Translation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1006–1015, Online. Association for Computational Linguistics.
Cite (Informal):
Iterative Refinement in the Continuous Space for Non-Autoregressive Neural Machine Translation (Lee et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2020.emnlp-main.73.pdf
Optional supplementary material:
 2020.emnlp-main.73.OptionalSupplementaryMaterial.zip
Video:
 https://slideslive.com/38938904
Code
 zomux/lanmt-ebm