Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors

Andrei Catalin Coman, Christos Theodoropoulos, Marie-Francine Moens, James Henderson


Abstract
We propose Fast-and-Frugal Text-Graph (FnF-TG) Transformers, a Transformer-based framework that unifies textual and structural information for inductive link prediction in text-attributed knowledge graphs. We demonstrate that, by effectively encoding ego-graphs (1-hop neighbourhoods), we can reduce the reliance on resource-intensive textual encoders. This makes the model both fast at training and inference time and frugal in terms of cost. We perform a comprehensive evaluation on three popular datasets and show that FnF-TG can achieve superior performance compared to previous state-of-the-art methods. We also extend inductive learning to a fully inductive setting, where relations do not rely on transductive (fixed) representations, as in previous work, but are instead a function of their textual description. Additionally, we introduce new variants of existing datasets, specifically designed to test the performance of models on unseen relations at inference time, thus offering a new test bed for fully inductive link prediction.
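To make the notion of an ego-graph concrete: the 1-hop neighbourhood of an entity is simply the set of triples incident to it. The toy sketch below illustrates that extraction step; it is not the authors' implementation, and the function and variable names (`ego_graph`, `triples`) are hypothetical.

```python
# Illustrative sketch only: a 1-hop ego-graph is the centre entity together
# with its immediate neighbours and the relations connecting them.
# This is NOT the paper's code; names here are made up for illustration.

def ego_graph(triples, centre):
    """Return the (head, relation, tail) triples incident to `centre`."""
    return [(h, r, t) for (h, r, t) in triples if h == centre or t == centre]

# A tiny toy knowledge graph.
triples = [
    ("Vienna", "capital_of", "Austria"),
    ("Austria", "member_of", "EU"),
    ("Vienna", "located_in", "Europe"),
]

# The ego-graph of "Vienna" keeps only the two triples touching it.
print(ego_graph(triples, "Vienna"))
```

In a text-attributed knowledge graph, each entity and relation in such a subgraph additionally carries a textual description, which is what FnF-TG encodes alongside the structure.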
Anthology ID:
2025.findings-acl.615
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
11828–11841
URL:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.findings-acl.615/
Cite (ACL):
Andrei Catalin Coman, Christos Theodoropoulos, Marie-Francine Moens, and James Henderson. 2025. Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors. In Findings of the Association for Computational Linguistics: ACL 2025, pages 11828–11841, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Fast-and-Frugal Text-Graph Transformers are Effective Link Predictors (Coman et al., Findings 2025)
PDF:
https://preview.aclanthology.org/acl25-workshop-ingestion/2025.findings-acl.615.pdf