基于大模型增强的两阶段高效事件共指消解方法

Wu Yaozong, Shuai Qi, Fangyuan Wang, Chenlong Bao, Jin-Tao Tang


Abstract
"本文针对两阶段事件共指消解方法存在的触发词词目启发机制缺乏同义词聚类能力和小模型理解触发词指代事件能力有限等问题,提出了一种基于大模型增强的两阶段高效的事件共指消解方法,一阶段引入大模型进行同义词聚类,二阶段大模型提供触发词解释文本增强小模型。此外,设计了引导小模型侧重触发词特征向量的损失函数。本文方法在保持近似线性时间复杂度的同时,在ECB+和GVC数据集上的CoNLLF1得分分别提升了2.9和8.0。"
Anthology ID:
2025.ccl-1.7
Volume:
Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025)
Month:
August
Year:
2025
Address:
Jinan, China
Editors:
Maosong Sun, Peiyong Duan, Zhiyuan Liu, Ruifeng Xu, Weiwei Sun
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Note:
Pages:
77–88
URL:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-1.7/
Cite (ACL):
Wu Yaozong, Shuai Qi, Fangyuan Wang, Chenlong Bao, and Jin-Tao Tang. 2025. 基于大模型增强的两阶段高效事件共指消解方法. In Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025), pages 77–88, Jinan, China. Chinese Information Processing Society of China.
Cite (Informal):
基于大模型增强的两阶段高效事件共指消解方法 (Yaozong et al., CCL 2025)
PDF:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-1.7.pdf