Conditional [MASK] Discrete Diffusion Language Model

Hyukhun Koh, Minha Jhang, Dohyung Kim, Sangmook Lee, Kyomin Jung


Abstract
Although auto-regressive models excel in natural language processing, they often struggle to generate diverse text and offer limited controllability. Non-auto-regressive methods are a potential alternative, but they often produce degenerate outputs and fall short in conditional generation. To address these challenges, we propose Diffusion-EAGS, a novel framework that integrates conditional masked language models into diffusion language models through the theoretical lens of a conditional Markov Random Field. Within this framework, we propose entropy-adaptive Gibbs sampling and entropy-based noise scheduling to counterbalance each model's shortcomings. Experimental results show that Diffusion-EAGS outperforms baselines and achieves the best quality-diversity trade-off, demonstrating its effectiveness in non-auto-regressive text generation.
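To make the sampling idea concrete, below is a minimal, hypothetical Python sketch of an entropy-adaptive unmasking loop. It is not the authors' implementation: the toy model, the vocabulary, and the lowest-entropy-first update order are illustrative assumptions about how per-position predictive entropy could drive a Gibbs-style decoding order in a masked diffusion setting.

# Hypothetical sketch of entropy-adaptive unmasking (not the paper's code).
# Assumes a conditional masked LM that returns a token distribution for each
# masked slot; toy_model below is a random stand-in, purely for illustration.
import math
import random

MASK = "[MASK]"
VOCAB = ["the", "cat", "sat", "on", "mat"]

def toy_model(tokens):
    # Stand-in for a conditional masked LM: one distribution per masked slot.
    dists = {}
    for i, tok in enumerate(tokens):
        if tok == MASK:
            w = [random.random() ** 2 for _ in VOCAB]  # skewed random weights
            z = sum(w)
            dists[i] = [p / z for p in w]
    return dists

def entropy(dist):
    # Shannon entropy of a token distribution.
    return -sum(p * math.log(p) for p in dist if p > 0)

def entropy_adaptive_decode(tokens):
    # Each step fills the masked slot whose predictive distribution has the
    # LOWEST entropy (highest model confidence), then re-queries the model:
    # a Gibbs-style sweep ordered by confidence instead of left-to-right.
    tokens = list(tokens)
    while MASK in tokens:
        dists = toy_model(tokens)
        pos = min(dists, key=lambda i: entropy(dists[i]))
        tokens[pos] = random.choices(VOCAB, weights=dists[pos], k=1)[0]
    return tokens

print(entropy_adaptive_decode(["the", MASK, "sat", MASK, "the", MASK]))

The same entropy signal could, in principle, also schedule how many positions are re-noised per diffusion step; the paper's actual scheduling rule is not reproduced here.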
Anthology ID: 2025.emnlp-main.450
Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2025
Address: Suzhou, China
Editors: Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 8910–8934
URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.450/
Cite (ACL): Hyukhun Koh, Minha Jhang, Dohyung Kim, Sangmook Lee, and Kyomin Jung. 2025. Conditional [MASK] Discrete Diffusion Language Model. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 8910–8934, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): Conditional [MASK] Discrete Diffusion Language Model (Koh et al., EMNLP 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.450.pdf
Checklist: 2025.emnlp-main.450.checklist.pdf