System Report for CCL25-Eval Task 4: Factivity Inference Based on Dynamic Few-Shot Learning

Sunyan Gu, Taoyu Lu, Siqi Liu, Kan Guo, Yan Shao


Abstract
This paper presents the implementation approach we employed in the First Chinese Factivity Inference Evaluation 2025 (FIE2025). Factivity inference (FI) is a semantic understanding task concerned with judging the truth value of events based on the use of semantic verbal elements such as “believe”, “falsely claim”, and “realize”. We approach factivity inference as a large language model (LLM) based task. We aim to enhance the LLM’s discriminative capability by integrating task-specific information via prompts and by constructing dynamic few-shot datasets for fine-tuning. Additionally, we incorporate data augmentation and ensemble strategies to further boost performance. Our approach achieves a score of 93.41% in the official evaluation of the shared task, ranking second on the leaderboard.
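The abstract's "dynamic few-shot" idea, selecting in-context examples per query rather than using a fixed set, can be sketched as below. This is a minimal illustration, not the authors' system: the bag-of-words cosine retriever, the label set (true/false/unknown), and the prompt wording are all assumptions for demonstration purposes.

```python
from collections import Counter
import math


def cosine(tokens_a, tokens_b):
    """Cosine similarity between two token lists (bag-of-words)."""
    ca, cb = Counter(tokens_a), Counter(tokens_b)
    num = sum(ca[t] * cb[t] for t in ca)
    den = (math.sqrt(sum(v * v for v in ca.values()))
           * math.sqrt(sum(v * v for v in cb.values())))
    return num / den if den else 0.0


def select_examples(query, pool, k=2):
    """Dynamically pick the k labeled examples most similar to the query."""
    return sorted(pool,
                  key=lambda ex: cosine(query.split(), ex["text"].split()),
                  reverse=True)[:k]


def build_prompt(query, pool, k=2):
    """Assemble a few-shot prompt from the retrieved examples."""
    lines = ["Judge whether the embedded event is factual "
             "(true / false / unknown)."]
    for ex in select_examples(query, pool, k):
        lines.append(f"Sentence: {ex['text']}\nLabel: {ex['label']}")
    lines.append(f"Sentence: {query}\nLabel:")
    return "\n".join(lines)
```

In practice, a stronger retriever (e.g., sentence embeddings) would replace the word-overlap similarity, and the retrieved shots could also be used to build per-query fine-tuning instances, as the abstract suggests.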
Anthology ID:
2025.ccl-2.15
Volume:
Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025)
Month:
August
Year:
2025
Address:
Jinan, China
Editors:
Hongfei Lin, Bin Li, Hongye Tan
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
128–133
URL:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-2.15/
Cite (ACL):
Sunyan Gu, Taoyu Lu, Siqi Liu, Kan Guo, and Yan Shao. 2025. System Report for CCL25-Eval Task 4: Factivity Inference Based on Dynamic Few-Shot Learning. In Proceedings of the 24th China National Conference on Computational Linguistics (CCL 2025), pages 128–133, Jinan, China. Chinese Information Processing Society of China.
Cite (Informal):
System Report for CCL25-Eval Task 4: Factivity Inference Based on Dynamic Few-Shot Learning (Gu et al., CCL 2025)
PDF:
https://preview.aclanthology.org/ingest-ccl/2025.ccl-2.15.pdf