An Empirical Study of Iterative Refinements for Non-autoregressive Translation

Yisheng Xiao, Pei Guo, Zechen Sun, Juntao Li, Kai Song, Min Zhang


Abstract
Iterative non-autoregressive (NAR) models combine the spirit of autoregressive (AR) and fully NAR models, seeking a balance between generation quality and inference efficiency. These models have recently demonstrated impressive performance on various generation tasks, even surpassing the autoregressive Transformer. However, they also face several challenges that impede further development. In this work, we aim to build more efficient and competitive iterative NAR models. First, we introduce two simple metrics to identify potential problems in current refinement processes, and revisit various iterative NAR models to find the key factors for achieving this goal. Subsequently, based on analyses of the limitations of previous inference algorithms, we propose a simple yet effective strategy to conduct efficient refinements without performance declines. Experiments on five widely used datasets show that our final models set new state-of-the-art performance among all previous NAR models, even with fewer decoding steps, and outperform the AR Transformer by around one BLEU point on average. Our code and models are available on GitHub.
Anthology ID:
2025.acl-long.1443
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
29851–29865
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1443/
Cite (ACL):
Yisheng Xiao, Pei Guo, Zechen Sun, Juntao Li, Kai Song, and Min Zhang. 2025. An Empirical Study of Iterative Refinements for Non-autoregressive Translation. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 29851–29865, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
An Empirical Study of Iterative Refinements for Non-autoregressive Translation (Xiao et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.1443.pdf