Beyond the Haystack: Sensitivity to Context in Legal Reference Recall

Eric Xia, Karthik Srikumar, Keshav Karthik, Advaith Renjith, Ashwinee Panda


Abstract
Reference retrieval is critical for many applications in the legal domain, for instance in determining which case texts support a particular claim. However, existing benchmarking methods do not enable rigorous evaluation of recall capabilities in previously unseen contexts. We develop an evaluation framework from U.S. court opinions which ensures models have no prior knowledge of case results or context. Applying our framework, we identify a consistent gap across models and tasks between traditional needle-in-a-haystack retrieval and actual performance in legal recall. Our work shows that standard needle-in-a-haystack benchmarks consistently overestimate recall performance in the legal domain. By isolating the cause of performance degradation to contextual informativity rather than distributional differences, our findings highlight the need for specialized testing in reference-critical applications and establish an evaluation framework for improving retrieval across informativity levels.
Anthology ID:
2025.nllp-1.5
Volume:
Proceedings of the Natural Legal Language Processing Workshop 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Nikolaos Aletras, Ilias Chalkidis, Leslie Barrett, Cătălina Goanță, Daniel Preoțiuc-Pietro, Gerasimos Spanakis
Venues:
NLLP | WS
Publisher:
Association for Computational Linguistics
Pages:
48–53
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.nllp-1.5/
Cite (ACL):
Eric Xia, Karthik Srikumar, Keshav Karthik, Advaith Renjith, and Ashwinee Panda. 2025. Beyond the Haystack: Sensitivity to Context in Legal Reference Recall. In Proceedings of the Natural Legal Language Processing Workshop 2025, pages 48–53, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Beyond the Haystack: Sensitivity to Context in Legal Reference Recall (Xia et al., NLLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.nllp-1.5.pdf