Relation-Aware Prompting Makes Large Language Models Effective Zero-shot Relation Extractors

Mahdi Rahimi, Razvan-Gabriel Dumitru, Mihai Surdeanu


Abstract
While supervised relation extraction (RE) models have considerably advanced the state of the art, they often perform poorly in low-resource settings. Zero-shot RE is vital when annotations are unavailable due to cost or time constraints, and it has consequently garnered interest in the research community. With the advent of large language models (LLMs), many approaches have been proposed for prompting LLMs for RE, but these methods often either rely on an accompanying small language model (e.g., for finetuning on synthetic data generated by LLMs) or require complex post-prompt processing. In this paper, we propose an effective prompt-based method that requires no additional resources. Instead, we use an LLM to perform a two-step process. In the first step, we perform a targeted summarization of the text with respect to the underlying relation, reduce the applicable label space, and synthesize examples. We then combine the products of these steps with other elements into a final prompt. We evaluate our approach with various LLMs on four real-world RE datasets. Our evaluation shows that our method outperforms previous state-of-the-art zero-shot methods by a large margin. This work can also serve as a strong new baseline for zero-shot RE that is compatible with any LLM.
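To make the two-step process described in the abstract concrete, the following is a minimal Python sketch of how such a pipeline could be wired together. Everything here is an illustrative assumption: the `llm` helper, the prompt wordings, and the candidate parsing are hypothetical and are not the authors' actual prompts or implementation (see the paper PDF for those).

```python
# Hypothetical sketch of a relation-aware, two-step zero-shot RE pipeline.
# All prompt wordings and helper names below are assumptions for illustration,
# not the prompts used in the paper.

def llm(prompt: str) -> str:
    """Placeholder for any LLM call (e.g., an API client). Assumed, not specified."""
    raise NotImplementedError("plug in your LLM backend here")

def extract_relation(text: str, head: str, tail: str, label_space: list[str]) -> str:
    # Step 1a: targeted summarization of the text with respect to the entity pair.
    summary = llm(
        f"Summarize the following text, focusing on how '{head}' and "
        f"'{tail}' are related:\n{text}"
    )

    # Step 1b: reduce the applicable label space to a few plausible candidates.
    candidates = llm(
        f"Given this summary:\n{summary}\n"
        f"Which of these relation labels could plausibly hold between "
        f"('{head}', '{tail}')? Answer with a comma-separated list.\n"
        f"Labels: {', '.join(label_space)}"
    ).split(", ")

    # Step 1c: synthesize one example sentence per candidate relation.
    examples = {
        c: llm(f"Write one sentence expressing the relation '{c}' between two entities.")
        for c in candidates
    }

    # Step 2: combine the summary, reduced label space, and synthetic examples
    # (plus the original text) into a single final prompt.
    final_prompt = (
        f"Text: {text}\nSummary: {summary}\n"
        + "\n".join(f"Example of '{c}': {s}" for c, s in examples.items())
        + f"\nWhich relation holds between '{head}' and '{tail}'? "
        f"Choose exactly one of: {', '.join(candidates)}."
    )
    return llm(final_prompt)
```

The key design point reflected in this sketch is that all intermediate artifacts (summary, reduced labels, synthetic examples) are produced by the same LLM, so the method needs no auxiliary small model or complex post-prompt processing.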
Anthology ID:
2025.starsem-1.22
Volume:
Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Lea Frermann, Mark Stevenson
Venue:
*SEM
Publisher:
Association for Computational Linguistics
Pages:
280–292
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.22/
Cite (ACL):
Mahdi Rahimi, Razvan-Gabriel Dumitru, and Mihai Surdeanu. 2025. Relation-Aware Prompting Makes Large Language Models Effective Zero-shot Relation Extractors. In Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025), pages 280–292, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Relation-Aware Prompting Makes Large Language Models Effective Zero-shot Relation Extractors (Rahimi et al., *SEM 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.22.pdf