Enabling Real-Time Conversations with Minimal Training Costs

Wang Xu, Haoyu Wang, Shuo Wang, Weilin Zhao, Xu Han, Yukun Yan, Haiyan Zhao, Yudi Zhang, Zhe Tao, Zhiyuan Liu, Wanxiang Che


Abstract
Large language models (LLMs) have demonstrated the ability to improve human efficiency through conversational interactions. Conventional LLM-powered dialogue systems, operating on a turn-based paradigm, preclude real-time interaction during response generation. To address this limitation, researchers have proposed duplex models, which can dynamically adapt to user input and provide real-time interactive feedback. However, these methods typically require substantial computational resources to acquire the duplex capability. To reduce this overhead, this paper presents a new duplex decoding approach that endows LLMs with duplex ability while requiring minimal additional training. Specifically, our method decodes conversational input and responses in parallel, effectively implementing a channel-division-multiplexing decoding strategy. Experimental results indicate that the proposed method significantly enhances the naturalness and human-likeness of user-AI interactions at minimal training cost.
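The abstract does not spell out the decoding loop, so the following toy Python sketch is an illustration only, not the paper's implementation. It assumes a "channel-division-multiplexing" style loop in which newly arrived user tokens (the input channel) are folded into a shared context before each response token (the response channel) is emitted; the names `duplex_decode`, `generate_next`, and the stub model are all hypothetical.

```python
from collections import deque


def duplex_decode(incoming_user_tokens, generate_next):
    """Toy channel-division-multiplexed decoding loop (illustrative only).

    At each step, any newly arrived user token is appended to the shared
    context (the "input channel") before the model emits its next response
    token (the "response channel"), so generation can adapt to user input
    while the response is still being produced.
    """
    context = []                      # shared sequence holding both channels
    response = []
    user_stream = deque(incoming_user_tokens)

    while True:
        # Input channel: fold in the next user token, if one has arrived.
        if user_stream:
            context.append(("user", user_stream.popleft()))
        # Response channel: ask the model for its next token.
        tok = generate_next(context)
        if tok is None:               # model signals end of response
            break
        context.append(("model", tok))
        response.append(tok)
    return response


def make_stub_model():
    """A stand-in for an LLM: acknowledges each user token once, then stops."""
    def generate_next(context):
        n_user = sum(1 for role, _ in context if role == "user")
        n_model = sum(1 for role, _ in context if role == "model")
        if n_model >= n_user:
            return None
        return f"ack-{n_model}"
    return generate_next
```

With the stub model, `duplex_decode(["hi", "there"], make_stub_model())` interleaves the two channels and returns `["ack-0", "ack-1"]`; in a real system the stub would be replaced by an LLM forward pass over the shared context.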
Anthology ID:
2025.ccl-1.85
Volume:
Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025)
Month:
August
Year:
2025
Address:
Jinan, China
Editors:
Maosong Sun, Peiyong Duan, Zhiyuan Liu, Ruifeng Xu, Weiwei Sun
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1136–1147
URL:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-1.85/
Cite (ACL):
Wang Xu, Haoyu Wang, Shuo Wang, Weilin Zhao, Xu Han, Yukun Yan, Haiyan Zhao, Yudi Zhang, Zhe Tao, Zhiyuan Liu, and Wanxiang Che. 2025. Enabling Real-Time Conversations with Minimal Training Costs. In Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025), pages 1136–1147, Jinan, China. Chinese Information Processing Society of China.
Cite (Informal):
Enabling Real-Time Conversations with Minimal Training Costs (Xu et al., CCL 2025)
PDF:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-1.85.pdf