Enhancing Rumor Detection Methods with Propagation Structure Infused Language Model

Chaoqun Cui, Siyuan Li, Kunkun Ma, Caiyan Jia


Abstract
Pretrained Language Models (PLMs) have excelled in various Natural Language Processing tasks, benefiting from large-scale pretraining and the self-attention mechanism's ability to capture long-range dependencies. However, their performance on social media tasks like rumor detection remains suboptimal. We attribute this to mismatches between pretraining corpora and social texts, inadequate handling of unique social symbols, and pretraining tasks ill-suited to modeling the user engagements implicit in propagation structures. To address these issues, we propose a continued pretraining strategy called Post Engagement Prediction (PEP) to infuse propagation-structure information into PLMs. PEP trains models to predict the root, branch, and parent relations between posts, capturing the interactions of stance and sentiment crucial for rumor detection. We also curate and release a large-scale Twitter corpus, TwitterCorpus (269GB of text), and two unlabeled claim conversation datasets with propagation structures (UTwitter and UWeibo). Using these resources and the PEP strategy, we train a Twitter-tailored PLM called SoLM. Extensive experiments demonstrate that PEP significantly boosts rumor detection performance across both universal and social media PLMs, even in few-shot scenarios. On benchmark datasets, PEP improves baseline models' accuracy by 1.0-3.7%, even enabling them to outperform current state-of-the-art methods on multiple datasets. SoLM alone, without high-level modules, also achieves competitive results, highlighting the strategy's effectiveness in learning discriminative post interaction features.
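The three pairwise relations that PEP predicts can be read directly off a conversation's propagation tree. The sketch below is a minimal, illustrative reconstruction of how such labels might be derived for a pair of posts; the function names, label scheme, and toy conversation are assumptions for exposition, not the paper's actual code.

```python
def ancestors(node, parent_of):
    """Walk up the propagation tree, collecting all ancestors of a post."""
    chain = []
    while node in parent_of:
        node = parent_of[node]
        chain.append(node)
    return chain

def pep_relations(a, b, parent_of, root):
    """Illustrative binary labels for a post pair (a, b):
    root   - one of the two posts is the source claim,
    branch - the two posts lie on the same reply chain,
    parent - one post is a direct reply to the other."""
    is_root = int(a == root or b == root)
    same_branch = int(a in ancestors(b, parent_of)
                      or b in ancestors(a, parent_of))
    is_parent = int(parent_of.get(b) == a or parent_of.get(a) == b)
    return {"root": is_root, "branch": same_branch, "parent": is_parent}

# Toy conversation: r is the claim post; p1 and p2 reply to r; p3 replies to p1.
parent_of = {"p1": "r", "p2": "r", "p3": "p1"}
print(pep_relations("r", "p3", parent_of, root="r"))
# {'root': 1, 'branch': 1, 'parent': 0}
```

In a continued-pretraining setup, labels like these would supervise classification heads over pairs of post representations, pushing the encoder to internalize who-replied-to-whom structure alongside the text itself.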
Anthology ID:
2025.coling-main.478
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
7165–7179
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.coling-main.478/
Cite (ACL):
Chaoqun Cui, Siyuan Li, Kunkun Ma, and Caiyan Jia. 2025. Enhancing Rumor Detection Methods with Propagation Structure Infused Language Model. In Proceedings of the 31st International Conference on Computational Linguistics, pages 7165–7179, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Enhancing Rumor Detection Methods with Propagation Structure Infused Language Model (Cui et al., COLING 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.coling-main.478.pdf