DrDiff: Dynamic Routing Diffusion with Hierarchical Attention for Breaking the Efficiency-Quality Trade-off
Jusheng Zhang | Yijia Fan | Kaitong Cai | Zimeng Huang | Xiaofei Sun | Jian Wang | Chengpei Tang | Keze Wang
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
This paper introduces DrDiff, a novel framework for long-text generation that overcomes the efficiency-quality trade-off through three core technologies. First, we design a dynamic expert scheduling mechanism that intelligently allocates computational resources during the diffusion process based on text complexity, enabling more efficient handling of generation tasks of varying difficulty. Second, we introduce a Hierarchical Sparse Attention (HSA) mechanism that adaptively adjusts attention patterns according to input length, reducing computational complexity from O(n²) to O(n) while maintaining model performance. Finally, we propose a Semantic Anchor States (SAS) module that, combined with DPM-Solver++, reduces the number of diffusion steps and significantly improves generation speed. Comprehensive experiments on various long-text generation benchmarks demonstrate the superiority of DrDiff over existing SOTA methods.
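The abstract does not specify how HSA achieves its O(n²) → O(n) reduction. As a rough illustration of the general idea behind sparse attention with a fixed-size window (each token attends only to a constant number of neighbors, so total cost grows linearly in sequence length), here is a minimal NumPy sketch; the function name, window parameter, and overall structure are illustrative assumptions, not DrDiff's actual HSA implementation:

```python
import numpy as np

def sliding_window_attention(q, k, v, window=4):
    """Local attention sketch: each query attends only to keys within a
    fixed window of radius `window`, so total work is O(n * window),
    i.e. O(n) for a constant window, instead of the O(n^2) of full
    attention. Purely illustrative; not the paper's HSA mechanism."""
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        # scaled dot-product scores restricted to the local window
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)
        weights = np.exp(scores - scores.max())  # stable softmax
        weights /= weights.sum()
        out[i] = weights @ v[lo:hi]
    return out

# Toy self-attention example with a window of radius 2.
rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))
out = sliding_window_attention(x, x, x, window=2)
print(out.shape)  # (16, 8)
```

An adaptive variant along the lines the abstract hints at might widen or restructure the window based on input length, trading locality for coverage while keeping the per-token cost bounded.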