Meta-Semantics Augmented Few-Shot Relational Learning

Han Wu, Jie Yin


Abstract
Few-shot relational learning on knowledge graphs (KGs) aims to perform reasoning over relations with only a few training examples. While current methods have focused primarily on leveraging specific relational information, rich semantics inherent in KGs have been largely overlooked. To bridge this gap, we propose PromptMeta, a novel prompted meta-learning framework that seamlessly integrates meta-semantics with relational information for few-shot relational learning. PromptMeta introduces two core innovations: (1) a Meta-Semantic Prompt (MSP) pool that learns and consolidates high-level meta-semantics shared across tasks, enabling effective knowledge transfer and adaptation to newly emerging relations; and (2) a learnable fusion mechanism that dynamically combines meta-semantics with task-specific relational information tailored to different few-shot tasks. Both components are optimized jointly with model parameters within a meta-learning framework. Extensive experiments and analyses on two real-world KG benchmarks validate the effectiveness of PromptMeta in adapting to new relations with limited supervision.
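The two components described in the abstract can be illustrated with a minimal sketch. This is NOT the paper's implementation: the pool size, the attention-style retrieval over the MSP pool, and the sigmoid-gated fusion are all illustrative assumptions standing in for the learned mechanisms the abstract names.

```python
import numpy as np

rng = np.random.default_rng(0)
d, K = 8, 4  # embedding dim and MSP pool size (hypothetical values)

# Hypothetical MSP pool: K learnable prompt vectors shared across tasks.
msp_pool = rng.normal(size=(K, d))

# Hypothetical parameters of the learnable fusion gate.
w_gate = rng.normal(size=2 * d)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def retrieve_meta_semantics(rel_emb):
    """Attend over the MSP pool with a task's relation embedding
    to obtain a task-conditioned meta-semantic vector (a sketch)."""
    scores = msp_pool @ rel_emb / np.sqrt(d)
    return softmax(scores) @ msp_pool  # weighted mixture of prompts

def fuse(rel_emb):
    """Gated fusion of meta-semantics with task-specific relational
    information; the sigmoid gate is an illustrative choice."""
    meta = retrieve_meta_semantics(rel_emb)
    g = 1.0 / (1.0 + np.exp(-w_gate @ np.concatenate([meta, rel_emb])))
    return g * meta + (1.0 - g) * rel_emb  # convex combination

# One few-shot task: a relation embedding derived from its support set.
rel_emb = rng.normal(size=d)
fused = fuse(rel_emb)
print(fused.shape)
```

In the actual framework, both the pool and the fusion parameters would be meta-learned jointly with the model; here they are frozen random values purely to show the data flow.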
Anthology ID:
2025.emnlp-main.1569
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
30811–30823
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1569/
Cite (ACL):
Han Wu and Jie Yin. 2025. Meta-Semantics Augmented Few-Shot Relational Learning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 30811–30823, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Meta-Semantics Augmented Few-Shot Relational Learning (Wu & Yin, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1569.pdf
Checklist:
 2025.emnlp-main.1569.checklist.pdf