Type-Less yet Type-Aware Inductive Link Prediction with Pretrained Language Models

Alessandro De Bellis, Salvatore Bufi, Giovanni Servedio, Vito Walter Anelli, Tommaso Di Noia, Eugenio Di Sciascio


Abstract
Inductive link prediction is emerging as a key paradigm for real-world knowledge graphs (KGs), where new entities frequently appear and models must generalize to them without retraining. In this setting, a model must predict links involving previously unseen entities by leveraging generalizable node features such as subgraph structure, type annotations, and ontological constraints. However, explicit type information is often lacking or incomplete, and even when available it tends to be coarse-grained, sparse, and prone to human annotation errors. In this work, we explore the potential of pre-trained language models (PLMs) to enrich node representations with implicit type signals. We introduce TyleR, a Type-less yet type-awaRe approach for subgraph-based inductive link prediction that leverages PLMs for semantic enrichment. Experiments on standard benchmarks demonstrate that TyleR outperforms state-of-the-art baselines in scenarios with scarce type annotations and sparse graph connectivity. To ensure reproducibility, we share our code at https://github.com/sisinflab/tyler.
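To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' released code; see the linked repository for the actual TyleR implementation) of how a PLM can supply implicit type signals: textual entity descriptions are encoded with a pretrained encoder, here assumed to be bert-base-uncased, and the resulting embeddings serve as node features that a subgraph-based link-prediction scorer could consume, even for entities unseen at training time.

```python
# Hypothetical sketch (not the official TyleR code): encode entity
# descriptions with a pretrained language model (PLM) so that node
# features carry implicit type signals without explicit annotations.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-uncased"  # assumed PLM; the paper may use another

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()

@torch.no_grad()
def encode_entities(descriptions):
    """Map textual entity descriptions to dense vectors.

    Mean-pooled PLM token embeddings act as an implicit type signal:
    entities of similar type tend to have similar descriptions, so no
    explicit (and often noisy) type annotations are required.
    """
    batch = tokenizer(descriptions, padding=True, truncation=True,
                      max_length=64, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state        # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)       # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)        # mean pooling

# Unseen entities at inference time: only their descriptions are needed.
node_features = encode_entities([
    "Turin, a city in northern Italy",
    "Polytechnic University of Bari, an Italian university",
])

# In a full system, a subgraph-based scorer (e.g., a GNN over the
# enclosing subgraph of a candidate pair) would consume these features;
# a dot product serves as a stand-in scorer here.
score = (node_features[0] * node_features[1]).sum()
print(float(score))
```

Because the features are derived purely from text, this style of enrichment generalizes to new entities by construction, which is what makes it suitable for the inductive setting the abstract describes.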
Anthology ID:
2025.emnlp-main.1383
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
27181–27197
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1383/
Cite (ACL):
Alessandro De Bellis, Salvatore Bufi, Giovanni Servedio, Vito Walter Anelli, Tommaso Di Noia, and Eugenio Di Sciascio. 2025. Type-Less yet Type-Aware Inductive Link Prediction with Pretrained Language Models. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 27181–27197, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Type-Less yet Type-Aware Inductive Link Prediction with Pretrained Language Models (De Bellis et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1383.pdf
Checklist:
 2025.emnlp-main.1383.checklist.pdf