Minha Jhang


2025

Conditional [MASK] Discrete Diffusion Language Model
Hyukhun Koh | Minha Jhang | Dohyung Kim | Sangmook Lee | Kyomin Jung
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing

Although auto-regressive models excel in natural language processing, they often struggle to generate diverse text and provide limited controllability. Non-auto-regressive methods could be an alternative but often produce degenerate outputs and exhibit shortcomings in conditional generation. To address these challenges, we propose Diffusion-EAGS, a novel framework that integrates conditional masked language models into diffusion language models through the theoretical lens of a conditional Markov Random Field. In doing so, we propose entropy-adaptive Gibbs sampling and entropy-based noise scheduling to counterbalance each model’s shortcomings. Experimental results show that Diffusion-EAGS outperforms baselines and achieves the best quality-diversity tradeoff, demonstrating its effectiveness in non-autoregressive text generation.
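The abstract names entropy-adaptive Gibbs sampling but does not spell out the selection rule. As a purely illustrative reading (an assumption, not the paper's actual algorithm), one minimal entropy-adaptive step could resample the masked position where the conditional masked-LM distribution has the lowest entropy, i.e. where the model is most confident:

```python
import math

def token_entropy(probs):
    # Shannon entropy (in nats) of one position's predictive distribution.
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def next_position(per_position_probs, masked_positions):
    # Illustrative entropy-adaptive choice: among still-masked positions,
    # fill in the one whose distribution has the lowest entropy first.
    # (Hypothetical helper; the paper's scheduling may differ.)
    return min(masked_positions,
               key=lambda i: token_entropy(per_position_probs[i]))

# Toy 3-token vocabulary, 3-position sequence; positions 0 and 2 are masked.
probs = [
    [0.90, 0.05, 0.05],  # model is confident here -> low entropy
    [0.34, 0.33, 0.33],  # already filled, ignored
    [0.40, 0.30, 0.30],  # model is uncertain here -> high entropy
]
print(next_position(probs, [0, 2]))  # prints 0: the low-entropy position
```

Iterating this choice, refilling the chosen position, and rescoring the rest yields a Gibbs-style loop whose order adapts to model confidence rather than following a fixed left-to-right schedule, which matches the abstract's contrast with auto-regressive decoding.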