Enhancing Automatic Term Extraction with Large Language Models via Syntactic Retrieval
Yongchan Chun, Minhyuk Kim, Dongjun Kim, Chanjun Park, Heuiseok Lim
Abstract
Automatic Term Extraction (ATE) identifies domain-specific expressions that are crucial for downstream tasks such as machine translation and information retrieval. Although large language models (LLMs) have significantly advanced various NLP tasks, their potential for ATE has scarcely been examined. We propose a retrieval-based prompting strategy that, in the few-shot setting, selects demonstrations according to syntactic rather than semantic similarity. This syntactic retrieval method is domain-agnostic and provides more reliable guidance for capturing term boundaries. We evaluate the approach in both in-domain and cross-domain settings, analyzing how lexical overlap between the query sentence and its retrieved examples affects performance. Experiments on three specialized ATE benchmarks show that syntactic retrieval improves F1-score. These findings highlight the importance of syntactic cues when adapting LLMs to terminology-extraction tasks.
- Anthology ID: 2025.findings-acl.516
- Volume: Findings of the Association for Computational Linguistics: ACL 2025
- Month: July
- Year: 2025
- Address: Vienna, Austria
- Editors: Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 9916–9926
- URL: https://preview.aclanthology.org/landing_page/2025.findings-acl.516/
- Cite (ACL): Yongchan Chun, Minhyuk Kim, Dongjun Kim, Chanjun Park, and Heuiseok Lim. 2025. Enhancing Automatic Term Extraction with Large Language Models via Syntactic Retrieval. In Findings of the Association for Computational Linguistics: ACL 2025, pages 9916–9926, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal): Enhancing Automatic Term Extraction with Large Language Models via Syntactic Retrieval (Chun et al., Findings 2025)
- PDF: https://preview.aclanthology.org/landing_page/2025.findings-acl.516.pdf