Introducing Graph Context into Language Models through Parameter-Efficient Fine-Tuning for Lexical Relation Mining

Jingwen Sun, Zhiyi Tian, Yu He, Jingwei Sun, Guangzhong Sun


Abstract
Lexical relations describe how words are related within a language. Prior work has demonstrated that pretrained language models (PLMs) can effectively mine lexical relations between word pairs. However, these approaches overlook the potential of graph structures composed of lexical relations, which can be integrated with the semantic knowledge of PLMs. In this work, we propose a parameter-efficient fine-tuning method based on graph context, which integrates graph features and semantic representations for lexical relation classification (LRC) and lexical entailment (LE) tasks. Our experiments show that graph features help PLMs better understand more complex lexical relations, establishing a new state of the art for LRC and LE. Finally, we perform an error analysis, identifying the bottlenecks of language models in lexical relation mining tasks and providing insights for future improvements.
Anthology ID:
2025.acl-long.511
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
10359–10374
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.511/
Cite (ACL):
Jingwen Sun, Zhiyi Tian, Yu He, Jingwei Sun, and Guangzhong Sun. 2025. Introducing Graph Context into Language Models through Parameter-Efficient Fine-Tuning for Lexical Relation Mining. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 10359–10374, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Introducing Graph Context into Language Models through Parameter-Efficient Fine-Tuning for Lexical Relation Mining (Sun et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.511.pdf