How to Contextualize Empirical Data for Risk Analysis with LLMs: A Case Study of Power Outages

Haiyun Huang, Yukun Li, Marco A Pretell, Jacob Naroian, Ebadah Khan, Liping Liu


Abstract
Large Language Models (LLMs) are increasingly considered for high-stakes decision-making, yet their application to statistical risk analysis remains largely underexplored. A central challenge in this domain is enabling LLMs to leverage historical data effectively. To address this, we propose novel methods for extracting key information from raw data and translating it into structured contextual input within the LLM prompt. Applying our methods to a case study of power outage risk assessment, we demonstrate that this contextualization strategy significantly improves the LLM's performance on risk assessment tasks. Although the LLM's predictive performance still falls short of a standard machine learning model, the LLM-based approach offers distinct advantages in versatility and interpretability. These findings point to a new paradigm for contextualizing data to support risk assessment.
Anthology ID: 2026.findings-eacl.324
Volume: Findings of the Association for Computational Linguistics: EACL 2026
Month: March
Year: 2026
Address: Rabat, Morocco
Editors: Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 6158–6172
URL: https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.324/
Cite (ACL): Haiyun Huang, Yukun Li, Marco A Pretell, Jacob Naroian, Ebadah Khan, and Liping Liu. 2026. How to Contextualize Empirical Data for Risk Analysis with LLMs: A Case Study of Power Outages. In Findings of the Association for Computational Linguistics: EACL 2026, pages 6158–6172, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal): How to Contextualize Empirical Data for Risk Analysis with LLMs: A Case Study of Power Outages (Huang et al., Findings 2026)
PDF: https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.324.pdf
Checklist: 2026.findings-eacl.324.checklist.pdf