Flexible-length Text Infilling for Discrete Diffusion Models

Andrew Zhang, Anushka Sivakumar, Chia-Wei Tang, Chris Thomas


Abstract
Discrete diffusion models are a new class of text generators that offer advantages such as bidirectional context use, parallelizable generation, and flexible prompting compared to autoregressive models. However, a critical limitation of discrete diffusion models is their inability to perform flexible-length or flexible-position text infilling without access to ground-truth positional data. We introduce DDOT (Discrete Diffusion with Optimal Transport Position Coupling), the first discrete diffusion model to overcome this challenge. DDOT jointly denoises token values and token positions, employing a novel sample-level Optimal Transport (OT) coupling. This coupling preserves relative token ordering while dynamically adjusting the positions and length of infilled segments, a capability previously missing in text diffusion. Our method is orthogonal to existing discrete text diffusion methods and is compatible with various pretrained text denoisers. Extensive experiments on text infilling benchmarks such as One-Billion-Word and Yelp demonstrate that DDOT outperforms naive diffusion baselines. Furthermore, DDOT achieves performance on par with state-of-the-art non-autoregressive models and enables significant improvements in training efficiency and flexibility.
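The sample-level OT position coupling described above can be illustrated with a minimal sketch (Python; the function names and the assumption that positions live in [0, 1] are illustrative, not taken from the paper's code). In one dimension, the optimal transport plan between two equal-size sets of points under a squared-distance cost is the monotone (sort-order) matching, and it is this non-crossing property that preserves relative token ordering while positions move:

    import numpy as np

    def ot_couple_positions(noisy_pos, target_pos):
        """Couple two equal-length 1D position sets via optimal transport.

        For 1D points under a squared-distance cost, the OT plan is the
        monotone (sorted) matching: coupled pairs never cross, so the
        relative order of tokens is preserved.
        """
        src_rank = np.argsort(noisy_pos)   # sort order of noisy positions
        tgt_rank = np.argsort(target_pos)  # sort order of target positions
        coupling = np.empty_like(src_rank)
        coupling[src_rank] = tgt_rank      # i-th noisy point -> its OT partner
        return coupling

    def interpolate_positions(noisy_pos, target_pos, coupling, t):
        """Move each noisy position linearly toward its coupled target at time t."""
        return (1.0 - t) * noisy_pos + t * target_pos[coupling]

    # Toy example: five token positions, random noise vs. ground truth
    rng = np.random.default_rng(0)
    noisy = rng.uniform(0.0, 1.0, size=5)
    target = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
    plan = ot_couple_positions(noisy, target)
    print(interpolate_positions(noisy, target, plan, t=0.5))

Because matched pairs never cross, interpolating each noisy position toward its coupled target can shift and stretch an infilled segment without reordering tokens, which is the behavior the abstract attributes to the coupling; DDOT's full method additionally denoises token values jointly with these positions.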
Anthology ID:
2025.emnlp-main.1597
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Association for Computational Linguistics
Pages:
31332–31347
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1597/
Cite (ACL):
Andrew Zhang, Anushka Sivakumar, Chia-Wei Tang, and Chris Thomas. 2025. Flexible-length Text Infilling for Discrete Diffusion Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 31332–31347, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Flexible-length Text Infilling for Discrete Diffusion Models (Zhang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1597.pdf
Checklist:
2025.emnlp-main.1597.checklist.pdf