Abstract
This paper presents a simple method that extends a standard Transformer-based autoregressive decoder to speed up decoding. The proposed method generates one token from the head and one from the tail of a sentence (two tokens in total) at each step. By simultaneously generating multiple tokens that rarely depend on each other, it increases decoding speed while minimizing the degradation in translation quality. In our experiments, the proposed method increased translation speed by around 113%–155% compared with a standard autoregressive decoder, while degrading BLEU scores by no more than 1.03. It was faster than an iterative non-autoregressive decoder under many conditions.
- Anthology ID: 2020.wat-1.3
- Volume: Proceedings of the 7th Workshop on Asian Translation
- Month: December
- Year: 2020
- Address: Suzhou, China
- Venue: WAT
- Publisher: Association for Computational Linguistics
- Pages: 50–57
- URL: https://aclanthology.org/2020.wat-1.3
- Cite (ACL): Kenji Imamura and Eiichiro Sumita. 2020. Transformer-based Double-token Bidirectional Autoregressive Decoding in Neural Machine Translation. In Proceedings of the 7th Workshop on Asian Translation, pages 50–57, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal): Transformer-based Double-token Bidirectional Autoregressive Decoding in Neural Machine Translation (Imamura & Sumita, WAT 2020)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2020.wat-1.3.pdf
- Data: ASPEC, WMT 2014
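The double-token decoding scheme described in the abstract can be sketched as a loop that grows the output from both ends at once, halving the number of decoding steps. This is a minimal illustrative sketch, not the paper's implementation: `next_tokens` is a hypothetical stub standing in for the real Transformer decoder, which would score vocabulary items conditioned on the source sentence and both partial prefixes.

```python
def next_tokens(head, tail, step):
    # Stub model: a real decoder would predict the next head token
    # (left-to-right) and the next tail token (right-to-left) jointly.
    demo = ["the", "cat", "sat", "on", "the", "mat"]
    return demo[step], demo[len(demo) - 1 - step]

def double_token_decode(target_len):
    head, tail = [], []  # tail is built right-to-left
    steps = 0
    while len(head) + len(tail) < target_len:
        left, right = next_tokens(head, tail, steps)
        head.append(left)
        # For odd target lengths, the final step emits only a head token.
        if len(head) + len(tail) < target_len:
            tail.insert(0, right)
        steps += 1
    return head + tail, steps

tokens, steps = double_token_decode(6)
print(tokens)  # ['the', 'cat', 'sat', 'on', 'the', 'mat']
print(steps)   # 3 steps instead of 6 autoregressive steps
```

Since each step emits two tokens, a length-n sentence needs roughly n/2 decoder passes, which is the source of the reported speedup when the head and tail tokens rarely depend on each other.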