Correct-Detect: Balancing Performance and Ambiguity Through the Lens of Coreference Resolution in LLMs

Amber Shore, Russell Scheinberg, Ameeta Agrawal, So Young Lee


Abstract
Large Language Models (LLMs) are intended to reflect human linguistic competencies. But humans have access to a broad and embodied context, which is key to detecting and resolving linguistic ambiguities, even in isolated text spans. A foundational case of semantic ambiguity is found in the task of coreference resolution: how is a pronoun related to an earlier person mention? This capability is implicit in nearly every downstream task, and the presence of ambiguity at this level can alter performance significantly. We show that LLMs can achieve good performance with minimal prompting in both coreference disambiguation and the detection of ambiguity in coreference; however, they cannot do both at the same time. We present the CORRECT-DETECT trade-off: though models have both capabilities and deploy them implicitly, successfully balancing the two abilities remains elusive.
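
To make the two abilities concrete, below is a minimal Python sketch of how the contrast can be probed with prompting. The prompt wordings, example sentences, and the query_llm stub are illustrative assumptions, not the paper's actual prompts, data, or evaluation code.

# Minimal sketch of the two prompt settings contrasted in the abstract.
# Assumption: prompts, examples, and query_llm are illustrative stand-ins.

CORRECT_PROMPT = (  # coreference disambiguation ("correct")
    "In the sentence below, which earlier mention does the pronoun refer to?\n"
    "Answer with that mention only.\n"
    "Sentence: {sentence}\n"
    "Pronoun: {pronoun}"
)

DETECT_PROMPT = (  # ambiguity detection ("detect")
    "In the sentence below, is the referent of the pronoun ambiguous?\n"
    "Answer 'ambiguous' or 'unambiguous'.\n"
    "Sentence: {sentence}\n"
    "Pronoun: {pronoun}"
)


def query_llm(prompt: str) -> str:
    """Hypothetical LLM call; swap in any real chat-completion API."""
    raise NotImplementedError


EXAMPLES = [
    # Resolvable (Winograd-style): world knowledge selects "the suitcase".
    ("The trophy does not fit in the suitcase because it is too small.", "it"),
    # Genuinely ambiguous: "she" could be the lawyer or the witness.
    ("The lawyer called the witness because she was late.", "she"),
]

for sentence, pronoun in EXAMPLES:
    # The paper's finding: a model can handle either prompt well in
    # isolation, but not both at once (the CORRECT-DETECT trade-off).
    print(CORRECT_PROMPT.format(sentence=sentence, pronoun=pronoun))
    print(DETECT_PROMPT.format(sentence=sentence, pronoun=pronoun))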
Anthology ID:
2025.emnlp-main.1527
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
30032–30046
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1527/
Cite (ACL):
Amber Shore, Russell Scheinberg, Ameeta Agrawal, and So Young Lee. 2025. Correct-Detect: Balancing Performance and Ambiguity Through the Lens of Coreference Resolution in LLMs. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 30032–30046, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Correct-Detect: Balancing Performance and Ambiguity Through the Lens of Coreference Resolution in LLMs (Shore et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1527.pdf
Checklist:
2025.emnlp-main.1527.checklist.pdf