Cross-domain Rumor Detection via Test-Time Adaptation and Large Language Models

Yuxia Gong, Shuguo Hu, Huaiwen Zhang


Abstract
Rumor detection on social media has become crucial due to the rapid spread of misinformation. Existing approaches primarily focus on within-domain tasks, resulting in suboptimal performance in cross-domain scenarios due to domain shift. To address this limitation, we draw inspiration from the strong generalization capabilities of Test-Time Adaptation (TTA) and propose a novel framework to enhance rumor detection performance across different domains. Specifically, we introduce Test-Time Adaptation for Rumor Detection (T2ARD), which incorporates both single-domain model adaptation and target-graph adaptation strategies tailored to the unique requirements of cross-domain rumor detection. T2ARD utilizes a graph adaptation module that updates the graph structure and node attributes through multi-level self-supervised contrastive learning, aiming to derive invariant graph representations. To mitigate the impact of significant distribution shifts on self-supervised signals, T2ARD performs model adaptation by using annotations from Large Language Models (LLMs) on the target graph to produce pseudo-labels as supervised signals. Experiments conducted on four widely used cross-domain datasets demonstrate that T2ARD achieves state-of-the-art performance, surpassing existing methods in rumor detection.
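The abstract describes two adaptation signals applied at test time: a self-supervised contrastive objective over views of the target propagation graph, and a supervised objective on pseudo-labels produced by an LLM annotator. The sketch below illustrates how such a loop might be wired up in PyTorch; it is not the authors' implementation. All names (GCNEncoder, augment, contrastive_loss, adapt_on_target) and hyperparameters are illustrative assumptions, the contrastive term is a simplified single-level view-alignment loss rather than the paper's multi-level formulation, and only model parameters are adapted here, whereas the paper also updates graph structure and node attributes.

# Hypothetical sketch of a test-time adaptation loop combining a contrastive
# self-supervised loss with an LLM pseudo-label loss. Not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNEncoder(nn.Module):
    """Two-layer GCN over a dense (normalized) adjacency matrix."""
    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, hid_dim)
        self.classifier = nn.Linear(hid_dim, num_classes)

    def forward(self, x, adj):
        h = F.relu(adj @ self.w1(x))
        h = adj @ self.w2(h)
        graph_repr = h.mean(dim=0)          # mean-pool node embeddings
        return graph_repr, self.classifier(graph_repr)

def augment(x, adj, drop_feat=0.2, drop_edge=0.2):
    """Random feature masking and edge dropping to build a second graph view."""
    x_aug = x * (torch.rand_like(x) > drop_feat).float()
    adj_aug = adj * (torch.rand_like(adj) > drop_edge).float()
    return x_aug, adj_aug

def contrastive_loss(z1, z2, temperature=0.5):
    """Align graph-level representations of two views (simplified, single level)."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    return -(z1 * z2).sum() / temperature

def adapt_on_target(model, x, adj, pseudo_label, steps=10, lam=0.5, lr=1e-3):
    """Test-time adaptation: contrastive self-supervision + LLM pseudo-label CE."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        z1, logits = model(x, adj)
        z2, _ = model(*augment(x, adj))
        loss = contrastive_loss(z1, z2) + lam * F.cross_entropy(
            logits.unsqueeze(0), pseudo_label.unsqueeze(0)
        )
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        _, logits = model(x, adj)
    return logits.argmax(dim=-1)

if __name__ == "__main__":
    x = torch.randn(30, 64)      # 30 posts in the target propagation graph, 64-d text features
    adj = torch.eye(30)          # placeholder normalized adjacency matrix
    pseudo = torch.tensor(1)     # pseudo-label obtained from an LLM annotator (assumed)
    model = GCNEncoder(64, 32, num_classes=2)
    print(adapt_on_target(model, x, adj, pseudo))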
Anthology ID:
2025.emnlp-main.407
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8062–8077
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.407/
Cite (ACL):
Yuxia Gong, Shuguo Hu, and Huaiwen Zhang. 2025. Cross-domain Rumor Detection via Test-Time Adaptation and Large Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 8062–8077, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Cross-domain Rumor Detection via Test-Time Adaptation and Large Language Models (Gong et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.407.pdf
Checklist:
2025.emnlp-main.407.checklist.pdf