Synergizing LLMs with Global Label Propagation for Multimodal Fake News Detection

Shuguo Hu, Jun Hu, Huaiwen Zhang


Abstract
Large Language Models (LLMs) can assist multimodal fake news detection by predicting pseudo labels. However, LLM-generated pseudo labels alone perform poorly compared to traditional detection methods, making their effective integration non-trivial. In this paper, we propose the Global Label Propagation Network with LLM-based Pseudo Labeling (GLPN-LLM) for multimodal fake news detection, which integrates LLM capabilities via label propagation. The global label propagation exploits LLM-generated pseudo labels, improving prediction accuracy by propagating label information among all samples. For label propagation, a mask-based mechanism prevents label leakage during training by ensuring that training nodes do not propagate their own labels back to themselves. Experimental results on benchmark datasets show that by synergizing LLMs with label propagation, our model outperforms state-of-the-art baselines.
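To make the mechanism concrete, below is a minimal sketch of mask-based global label propagation built only from what the abstract states. The similarity graph, the iterative propagation rule (classic label-spreading style), and all names here are assumptions for illustration, not the paper's implementation. The self-mask zeroes the diagonal of the propagation matrix so that no sample's own label feeds back into its propagated estimate.

import numpy as np

def masked_label_propagation(sim, labels, n_iters=3, alpha=0.8):
    """Hedged sketch of global label propagation with a self-mask.

    sim    : (n, n) nonnegative pairwise similarities over ALL samples
    labels : (n, c) label matrix mixing one-hot ground-truth rows
             (training samples) and soft LLM pseudo-label rows
             (unlabeled/test samples)
    Returns propagated label scores of shape (n, c).
    """
    n = sim.shape[0]
    # Self-mask: zero the diagonal so no sample contributes its own
    # label to its propagated estimate (the leakage guard the abstract
    # describes; the paper's exact masking may differ).
    S = sim * (1.0 - np.eye(n))
    # Row-normalize to obtain a propagation matrix.
    S = S / np.clip(S.sum(axis=1, keepdims=True), 1e-12, None)

    F = labels.copy()
    for _ in range(n_iters):
        # Blend neighbors' labels with the initial ground-truth and
        # LLM pseudo labels, which act as anchors each iteration.
        F = alpha * (S @ F) + (1.0 - alpha) * labels
    return F

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(6, 4))
    sim = np.maximum(feats @ feats.T, 0.0)        # toy similarity graph
    labels = np.array([[1, 0], [1, 0], [0, 1],    # ground truth (train)
                       [0.6, 0.4], [0.3, 0.7],    # LLM pseudo labels
                       [0.5, 0.5]])
    print(masked_label_propagation(sim, labels))

Note that this one-matrix sketch only blocks the direct self-contribution; the paper's training-time mask may additionally block multi-hop paths through which a training node's label could return to itself.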
Anthology ID:
2025.acl-long.72
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1426–1440
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.72/
Cite (ACL):
Shuguo Hu, Jun Hu, and Huaiwen Zhang. 2025. Synergizing LLMs with Global Label Propagation for Multimodal Fake News Detection. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1426–1440, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Synergizing LLMs with Global Label Propagation for Multimodal Fake News Detection (Hu et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.72.pdf