Evaluating Large Language Models for Belief Inference: Mapping Belief Networks at Scale

Trisevgeni Papakonstantinou, Antonina Zhiteneva, Ana Yutong Ma, Derek Powell, Zachary Horne


Abstract
Beliefs are interconnected, influencing how people process and update what they think. To study the interconnectedness of beliefs at scale, we introduce a novel analytical pipeline leveraging a fine-tuned GPT-4o model to infer belief structures from large-scale social media data. We evaluate the model's performance by (1) comparing its outputs to human-annotated data and (2) comparing its inferences to human-generated survey data. Our results show that a fine-tuned GPT-4o model can effectively recover belief structures, allowing for a level of scalability and efficiency that is impossible to achieve with traditional survey methods of data collection. This work demonstrates the potential for large language models to perform belief inference tasks and provides a framework for future research on the analysis of belief structures.
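As a rough illustration of the kind of belief-inference call such a pipeline might make, the sketch below queries a fine-tuned GPT-4o model for the relation a social media post expresses between two beliefs. The model identifier, prompt wording, and label set are assumptions for illustration only and are not taken from the paper.

```python
"""Minimal sketch (not the authors' released code) of querying a fine-tuned
GPT-4o model to infer the relation between a pair of beliefs in a post."""
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical fine-tuned model identifier; the exact ID is not given here.
FINETUNED_MODEL = "ft:gpt-4o-2024-08-06:example-org:belief-inference:xxxx"

def infer_belief_relation(post: str, belief_a: str, belief_b: str) -> str:
    """Ask the fine-tuned model whether a post links two beliefs.

    Returns one of the (assumed) labels: 'supports', 'opposes', 'unrelated'.
    """
    response = client.chat.completions.create(
        model=FINETUNED_MODEL,
        messages=[
            {
                "role": "system",
                "content": (
                    "You infer relations between beliefs expressed in text. "
                    "Answer with exactly one label: supports, opposes, or unrelated."
                ),
            },
            {
                "role": "user",
                "content": (
                    f"Post: {post}\n"
                    f"Belief A: {belief_a}\n"
                    f"Belief B: {belief_b}\n"
                    "What relation between A and B does the post express?"
                ),
            },
        ],
        temperature=0,
    )
    return response.choices[0].message.content.strip()

# Example use: aggregating such pairwise judgments over many posts would
# yield edges of a belief network (beliefs as nodes, relations as edges).
if __name__ == "__main__":
    label = infer_belief_relation(
        post="If vaccines were safe, they wouldn't need liability shields.",
        belief_a="Vaccines are safe.",
        belief_b="Pharmaceutical companies are trustworthy.",
    )
    print(label)
```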
Anthology ID:
2025.findings-emnlp.1132
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
20787–20795
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1132/
DOI:
10.18653/v1/2025.findings-emnlp.1132
Cite (ACL):
Trisevgeni Papakonstantinou, Antonina Zhiteneva, Ana Yutong Ma, Derek Powell, and Zachary Horne. 2025. Evaluating Large Language Models for Belief Inference: Mapping Belief Networks at Scale. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 20787–20795, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Evaluating Large Language Models for Belief Inference: Mapping Belief Networks at Scale (Papakonstantinou et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.1132.pdf
Checklist:
2025.findings-emnlp.1132.checklist.pdf