Training in Step-by-Step Formal Reasoning Improves Pronominal Reasoning in Language Models

Vagrant Gautam


Abstract
Large reasoning models are trained to solve problems by decomposing them into steps. While they show impressive progress on reasoning tasks, "reasoning" here is typically limited to formal reasoning, i.e., math, code, and logic. An open question is whether these abilities transfer to _pronominal reasoning_, where step-by-step thinking in non-reasoning models worsens performance, but code pre-training may help. I answer this question by evaluating six pairs of original and DeepSeek-distilled models (1.5B-70B parameters) on six challenging datasets for English pronoun resolution (identifying whom a pronoun refers to) and pronoun fidelity (learning and applying a pronoun mapping correctly). Performance improves statistically significantly on all datasets (31% relative increase), indicating that distilling step-by-step formal reasoning does in fact help with pronominal reasoning, in part by improving instruction-following. With a qualitative evaluation of 720 generations, I show that improvements occur across granular error types, and come from plausible-looking reasoning chains employing a variety of reasoning strategies. However, the gains put models just above random performance on these datasets, leaving plenty of room for improvement.
Anthology ID:
2026.eacl-short.7
Volume:
Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
121–135
URL:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-short.7/
Cite (ACL):
Vagrant Gautam. 2026. Training in Step-by-Step Formal Reasoning Improves Pronominal Reasoning in Language Models. In Proceedings of the 19th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 121–135, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Training in Step-by-Step Formal Reasoning Improves Pronominal Reasoning in Language Models (Gautam, EACL 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.eacl-short.7.pdf
Checklist:
 2026.eacl-short.7.checklist.pdf