Abstract
Non-autoregressive models achieve significant decoding speedup in neural machine translation but lack the ability to capture sequential dependency. The Directed Acyclic Transformer (DA-Transformer) was recently proposed to model sequential dependency with a directed acyclic graph. Consequently, it has to apply a sequential decision process at inference time, which harms the global translation accuracy. In this paper, we present a Viterbi decoding framework for DA-Transformer, which is guaranteed to find the jointly optimal translation and decoding path under any length constraint. Experimental results demonstrate that our approach consistently improves the performance of DA-Transformer while maintaining a similar decoding speedup.
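To make the decoding problem concrete, here is a minimal sketch of Viterbi search over a DA-Transformer-style directed acyclic graph, assuming per-vertex token emission log-probabilities and a forward-only vertex transition matrix. The function `viterbi_dag_decode` and its argument names are hypothetical illustrations, not the paper's released code; details such as the paper's handling of multiple candidate lengths are omitted.

```python
import numpy as np

def viterbi_dag_decode(emit_logp, trans_logp, target_len):
    """Best path emitting exactly `target_len` tokens through a
    forward-only DAG, from vertex 0 to vertex L-1.

    emit_logp:  (L, V) array, log P(token | vertex)
    trans_logp: (L, L) array, log P(vertex j | vertex i), -inf for j <= i
    Returns (log-score, list of token ids).
    """
    L = emit_logp.shape[0]
    # Given a fixed path, tokens are conditionally independent, so the
    # optimal translation emits the argmax token at each visited vertex.
    best_tok = emit_logp.max(axis=1)
    # alpha[t, j]: best score of a path that has emitted t+1 tokens, ending at j.
    alpha = np.full((target_len, L), -np.inf)
    back = np.zeros((target_len, L), dtype=int)
    alpha[0, 0] = best_tok[0]  # paths start at the first vertex
    for t in range(1, target_len):
        for j in range(1, L):
            scores = alpha[t - 1, :j] + trans_logp[:j, j]
            back[t, j] = int(np.argmax(scores))
            alpha[t, j] = scores[back[t, j]] + best_tok[j]
    # Backtrace from the final vertex L-1.
    path = [L - 1]
    for t in range(target_len - 1, 0, -1):
        path.append(back[t, path[-1]])
    path.reverse()
    tokens = emit_logp[np.array(path)].argmax(axis=1)
    return alpha[target_len - 1, L - 1], tokens.tolist()
```

The abstract's "any length constraint" corresponds to running this search once per candidate length and keeping the best-scoring result; the scalar loop above handles a single length for readability, whereas a practical implementation would vectorize the recursion.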
- Anthology ID: 2022.findings-emnlp.322
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2022
- Month: December
- Year: 2022
- Address: Abu Dhabi, United Arab Emirates
- Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 4390–4397
- URL: https://aclanthology.org/2022.findings-emnlp.322
- DOI: 10.18653/v1/2022.findings-emnlp.322
- Cite (ACL): Chenze Shao, Zhengrui Ma, and Yang Feng. 2022. Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 4390–4397, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
- Cite (Informal): Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation (Shao et al., Findings 2022)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2022.findings-emnlp.322.pdf