Entity Tracking in Small Language Models: An Attention-Based Study of Parameter-Efficient Fine-Tuning

Sungho Jeon, Michael Strube

Abstract
The ability to track entities is fundamental for language understanding, yet the internal mechanisms governing this capability in Small Language Models (SLMs) are poorly understood. Previous studies often rely on indirect probing or complex interpretability methods, leaving a gap for lightweight diagnostics that connect model behavior to performance. To bridge this gap, we introduce a framework to analyze entity tracking by measuring the attention flow between entity and non-entity tokens within SLMs. We apply this to analyze models both before and after Parameter-Efficient Fine-Tuning (PEFT). Our analysis reveals two key findings. First, SLMs’ attentional strategies vary significantly with text type, but entities consistently receive a high degree of focus. Second, we show that PEFT – specifically QLoRA – dramatically improves classification performance on entity-centric tasks by increasing the model’s attentional focus on entity-related tokens. Our work provides direct evidence for how PEFT can refine a model’s internal mechanisms and establishes attention analysis as a valuable, lightweight diagnostic tool for interpreting and improving SLMs.
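The measurement procedure itself is detailed in the paper; purely as a rough illustration of the general idea, the sketch below (an assumption-laden example, not the authors' code) computes the share of attention mass that flows onto entity tokens using the HuggingFace transformers library, with GPT-2 standing in for an SLM and entity token positions supplied externally (e.g., by an NER pass):

```python
# A minimal sketch, NOT the paper's implementation: it assumes the HuggingFace
# `transformers` library, uses GPT-2 as a stand-in SLM, and takes entity token
# positions from an external source (e.g., an NER tagger plus tokenizer offsets).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2")
model.eval()

def entity_attention_share(text: str, entity_positions: set[int]) -> float:
    """Average fraction of attention mass directed at entity tokens,
    pooled over all layers, heads, and query positions."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs, output_attentions=True)
    # outputs.attentions: one (batch, heads, seq, seq) tensor per layer
    att = torch.stack(outputs.attentions).squeeze(1)  # (layers, heads, seq, seq)
    mask = torch.zeros(att.shape[-1], dtype=torch.bool)
    mask[list(entity_positions)] = True
    # Attention each query position sends to entity tokens; every row of an
    # attention matrix sums to 1, so the mean is a share in [0, 1].
    return att[..., mask].sum(dim=-1).mean().item()

# Entity positions here are illustrative; in practice derive them from the
# tokenizer's offset mapping aligned against entity spans.
print(entity_attention_share("Alice handed Bob the keys.", {0, 2}))
```

Comparing this share between a base checkpoint and its QLoRA-tuned counterpart would mirror the before/after-PEFT comparison described in the abstract.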
Anthology ID:
2025.codi-1.4
Volume:
Proceedings of the 6th Workshop on Computational Approaches to Discourse, Context and Document-Level Inferences (CODI 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Michael Strube, Chloe Braud, Christian Hardmeier, Junyi Jessy Li, Sharid Loaiciga, Amir Zeldes, Chuyuan Li
Venues:
CODI | WS
Publisher:
Association for Computational Linguistics
Pages:
42–53
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.codi-1.4/
Cite (ACL):
Sungho Jeon and Michael Strube. 2025. Entity Tracking in Small Language Models: An Attention-Based Study of Parameter-Efficient Fine-Tuning. In Proceedings of the 6th Workshop on Computational Approaches to Discourse, Context and Document-Level Inferences (CODI 2025), pages 42–53, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Entity Tracking in Small Language Models: An Attention-Based Study of Parameter-Efficient Fine-Tuning (Jeon & Strube, CODI 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.codi-1.4.pdf
Supplementary material:
2025.codi-1.4.SupplementaryMaterial.zip