SLENDER: Structured Outputs for SLM-based NER in Low-Resource Englishes

Nicole Ren, James Teo


Abstract
Named Entity Recognition (NER) for low-resource variants of English remains challenging, as most NER models are trained on datasets predominantly focused on American or British English. While recent work has shown that proprietary Large Language Models (LLMs) can perform NER effectively in low-resource settings through in-context learning, practical deployment is limited by their high computational costs and privacy concerns. Open-source Small Language Models (SLMs) offer a promising alternative, but the tendency of these Language Models (LMs) to hallucinate poses challenges for production use. To address this, we introduce SLENDER, a novel output format for LM-based NER that achieves a three-fold reduction in inference time on average compared to the JSON format, which is widely used for structured outputs. Our approach, using Gemma-2-9B-it with the SLENDER output format and constrained decoding in zero-shot settings, outperforms the en_core_web_trf model from spaCy, an industry-standard NER tool, in all five regions of the Worldwide test set.
Anthology ID:
2025.acl-industry.59
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Georg Rehm, Yunyao Li
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
836–849
URL:
https://preview.aclanthology.org/landing_page/2025.acl-industry.59/
Cite (ACL):
Nicole Ren and James Teo. 2025. SLENDER: Structured Outputs for SLM-based NER in Low-Resource Englishes. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 6: Industry Track), pages 836–849, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
SLENDER: Structured Outputs for SLM-based NER in Low-Resource Englishes (Ren & Teo, ACL 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.acl-industry.59.pdf