Tree-Structured Non-Autoregressive Decoding for Sequence-to-Sequence Text Generation

Pengyu Ji, Yufei Liu, Xiang Hu, Kewei Tu


Abstract
Autoregressive Transformer (AT) dominates sequence-to-sequence generation tasks but suffers from high inference latency due to sequential token generation. Non-Autoregressive Transformer (NAT) improves inference efficiency by parallelizing token prediction, yet degrades generation quality. To address these limitations, we propose Tree-structured Non-Autoregressive Decoding (TNAD), a novel paradigm that bridges autoregressive and non-autoregressive decoding. TNAD generates a sentence through a top-down, layer-wise expansion of its constituency parse tree, enabling parallel generation within each layer while preserving contextual dependencies across layers. Experimental results on machine translation and paraphrase generation demonstrate that TNAD outperforms AT in efficiency and NAT in generation quality, thus offering a new alternative to AT and NAT in the trade-off between efficiency and quality. Our code is publicly available at https://github.com/jipy0222/TNAD.
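
To make the abstract's layer-wise expansion idea concrete, below is a minimal, self-contained sketch (not the paper's implementation; the names Node, expand_layer, and the toy splitting rule are hypothetical stand-ins for a learned model): each layer of the constituency tree is predicted "in parallel" conditioned on the layers above it, and the leaves of the finished tree, read left to right, form the output sentence.

    # Conceptual sketch of top-down, layer-wise tree decoding (illustrative only).
    # A real model would batch each layer's predictions in one forward pass;
    # here a toy rule splits non-terminals until a depth limit, then emits tokens.

    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Node:
        label: str                      # constituent label or terminal token
        children: List["Node"] = field(default_factory=list)


    def expand_layer(frontier: List[Node], depth: int) -> List[Node]:
        """Expand every frontier node of the current layer 'in parallel'."""
        next_frontier: List[Node] = []
        for node in frontier:
            if depth < 2:                                   # keep expanding non-terminals
                node.children = [Node(f"{node.label}.{i}") for i in range(2)]
                next_frontier.extend(node.children)
            else:                                           # emit placeholder terminals
                node.children = [Node(f"tok({node.label})")]
        return next_frontier


    def decode(root_label: str = "S") -> List[str]:
        root = Node(root_label)
        frontier, depth = [root], 0
        while frontier:                                     # one parallel step per tree layer
            frontier = expand_layer(frontier, depth)
            depth += 1
        leaves: List[str] = []

        def collect(n: Node) -> None:                       # read leaves left to right
            if not n.children:
                leaves.append(n.label)
            for c in n.children:
                collect(c)

        collect(root)
        return leaves


    if __name__ == "__main__":
        print(decode())   # placeholder tokens standing in for generated words

In this sketch the number of sequential steps equals the tree depth rather than the sentence length, which is the source of the efficiency gain the abstract describes; cross-layer conditioning is what preserves contextual dependencies.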
Anthology ID:
2025.findings-emnlp.327
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6168–6174
URL:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.findings-emnlp.327/
DOI:
10.18653/v1/2025.findings-emnlp.327
Cite (ACL):
Pengyu Ji, Yufei Liu, Xiang Hu, and Kewei Tu. 2025. Tree-Structured Non-Autoregressive Decoding for Sequence-to-Sequence Text Generation. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 6168–6174, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Tree-Structured Non-Autoregressive Decoding for Sequence-to-Sequence Text Generation (Ji et al., Findings 2025)
PDF:
https://preview.aclanthology.org/name-variant-enfa-fane/2025.findings-emnlp.327.pdf
Checklist:
2025.findings-emnlp.327.checklist.pdf