DiaDP@XLLM25: Advancing Chinese Dialogue Parsing via Unified Pretrained Language Models and Biaffine Dependency Scoring
Shuoqiu Duan, Xiaoliang Chen, Duoqian Miao, Xu Gu, Xianyong Li, Yajun Du
Abstract
Dialogue-level dependency parsing is crucial for understanding complex linguistic structures in conversational data, yet progress has been hindered by limited annotated resources and inadequate modeling of dialogue dynamics. Existing methods often fail to capture both intra- and inter-utterance dependencies effectively, particularly in languages like Chinese with rich contextual interactions. To address these challenges, we propose InterParser, a novel framework that integrates a pretrained language model (PLM), bidirectional GRU (BiGRU), and biaffine scoring for comprehensive dependency parsing. Our model encodes token sequences using a PLM, refines representations via deep BiGRU layers, and employs separate projections for “head” and “dependent” roles to optimize arc and relation prediction. For cross-utterance dependencies, speaker-specific feature projections are introduced to enhance dialogue-aware scoring. Joint training minimizes cross-entropy losses for both intra- and inter-utterance dependencies, ensuring unified optimization. Experiments on a standard Chinese benchmark demonstrate that InterParser significantly outperforms prior methods, achieving state-of-the-art labeled attachment scores (LAS) for both intra- and inter-utterance parsing.
- Anthology ID: 2025.xllm-1.22
- Volume: Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025)
- Month: August
- Year: 2025
- Address: Vienna, Austria
- Editors: Hao Fei, Kewei Tu, Yuhui Zhang, Xiang Hu, Wenjuan Han, Zixia Jia, Zilong Zheng, Yixin Cao, Meishan Zhang, Wei Lu, N. Siddharth, Lilja Øvrelid, Nianwen Xue, Yue Zhang
- Venues: XLLM | WS
- Publisher: Association for Computational Linguistics
- Pages: 266–273
- URL: https://preview.aclanthology.org/landing_page/2025.xllm-1.22/
- Cite (ACL): Shuoqiu Duan, Xiaoliang Chen, Duoqian Miao, Xu Gu, Xianyong Li, and Yajun Du. 2025. DiaDP@XLLM25: Advancing Chinese Dialogue Parsing via Unified Pretrained Language Models and Biaffine Dependency Scoring. In Proceedings of the 1st Joint Workshop on Large Language Models and Structure Modeling (XLLM 2025), pages 266–273, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal): DiaDP@XLLM25: Advancing Chinese Dialogue Parsing via Unified Pretrained Language Models and Biaffine Dependency Scoring (Duan et al., XLLM 2025)
- PDF: https://preview.aclanthology.org/landing_page/2025.xllm-1.22.pdf
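The biaffine arc scoring with separate “head” and “dependent” projections described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name, the ReLU projections, and all dimensions are illustrative assumptions, and the encoder states would in practice come from the PLM + BiGRU stack.

```python
import numpy as np

def relu(x):
    # Nonlinearity applied after each role-specific projection (assumed; the
    # paper's exact activation is not specified here).
    return np.maximum(x, 0.0)

def biaffine_arc_scores(enc, W_head, W_dep, U):
    """Biaffine arc scoring for one utterance (illustrative sketch).

    enc:    (T, H) encoder states for T tokens (e.g. PLM + BiGRU outputs)
    W_head: (H, d) projection for the "head" role
    W_dep:  (H, d) projection for the "dependent" role
    U:      (d+1, d) biaffine weight; the extra row acts as a per-head bias
            because the head vectors are augmented with a constant 1

    Returns scores of shape (T, T), where scores[i, j] is the score of
    token j being the syntactic head of token i.
    """
    head = relu(enc @ W_head)                          # (T, d)
    dep = relu(enc @ W_dep)                            # (T, d)
    ones = np.ones((head.shape[0], 1))
    head = np.concatenate([head, ones], axis=1)        # (T, d+1)
    # scores[i, j] = dep_i^T U^T head_j  (bilinear form plus head bias)
    return dep @ U.T @ head.T                          # (T, T)

# Toy usage with random weights (shapes only; no trained parameters).
rng = np.random.default_rng(0)
T, H, d = 5, 16, 8
enc = rng.normal(size=(T, H))
scores = biaffine_arc_scores(
    enc,
    rng.normal(size=(H, d)),
    rng.normal(size=(H, d)),
    rng.normal(size=(d + 1, d)),
)
predicted_heads = scores.argmax(axis=1)  # greedy head choice per token
```

At training time, each row of `scores` would feed a softmax cross-entropy loss against the gold head index, matching the joint cross-entropy objective the abstract describes; relation labeling and the speaker-specific inter-utterance scoring would use analogous projections and biaffine terms.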