What Matters to an LLM? Behavioral and Computational Evidences from Summarization

Yongxin Zhou, Changshun Wu, Philippe Mulhem, Didier Schwab, Maxime Peyrard


Abstract
Large Language Models (LLMs) are now state-of-the-art at summarization, yet the internal notion of importance that drives their information selection remains hidden. We investigate this question by combining behavioral and computational analyses. Behaviorally, we generate a series of length-controlled summaries for each document and derive empirical importance distributions from how often each information unit is selected. These distributions reveal that LLMs converge on consistent importance patterns, sharply different from pre-LLM baselines, and that LLMs cluster more by family than by size. Computationally, we find that certain attention heads align closely with the empirical importance distributions, and that middle-to-late layers are strongly predictive of importance. Together, these results provide initial insights into *what* LLMs prioritize in summarization and *how* this priority is internally represented, opening a path toward interpreting and ultimately controlling information selection in these models.
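A minimal sketch of the behavioral step the abstract describes, deriving an empirical importance distribution from repeated summary selections, might look as follows. The unit indexing and the example `selections` data are hypothetical stand-ins for illustration, not the authors' implementation.

```python
from collections import Counter

def empirical_importance(selections, num_units):
    """Derive an empirical importance distribution over a document's
    information units from a set of length-controlled summaries.

    selections: list of lists; each inner list holds the indices of the
                source units selected in one generated summary.
    num_units:  total number of information units in the document.
    """
    counts = Counter(idx for summary in selections for idx in summary)
    total = sum(counts.values())
    # Normalize selection frequencies into a probability distribution.
    return [counts.get(i, 0) / total for i in range(num_units)]

# Hypothetical example: 4 summaries of varying length over 5 units.
selections = [[0, 2], [0, 2, 4], [2], [0, 1, 2, 4]]
print(empirical_importance(selections, num_units=5))
```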
Anthology ID:
2026.findings-eacl.302
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5712–5737
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.302/
Cite (ACL):
Yongxin Zhou, Changshun Wu, Philippe Mulhem, Didier Schwab, and Maxime Peyrard. 2026. What Matters to an LLM? Behavioral and Computational Evidences from Summarization. In Findings of the Association for Computational Linguistics: EACL 2026, pages 5712–5737, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
What Matters to an LLM? Behavioral and Computational Evidences from Summarization (Zhou et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.302.pdf
Checklist:
 2026.findings-eacl.302.checklist.pdf