Can Large Language Models Act as Ensembler for Multi-GNNs?

Hanqi Duan, Yao Cheng, Jianxiang Yu, Yao Liu, Xiang Li


Abstract
Graph Neural Networks (GNNs) have emerged as powerful models for learning from graph-structured data. However, GNNs lack the ability to semantically understand rich textual node attributes, which limits their effectiveness in applications. Moreover, we empirically observe that no existing GNN model consistently outperforms the others across diverse datasets. In this paper, we study whether LLMs can act as an ensembler for multiple GNNs and propose the LensGNN model. The model first aligns multiple GNNs, mapping the representations of different GNNs into the same space. Then, through LoRA fine-tuning, it aligns the GNN space with the LLM space, injecting graph tokens and textual information into the LLM. This allows LensGNN to ensemble multiple GNNs while taking advantage of the strengths of LLMs, leading to a deeper understanding of both textual semantics and graph structure. Experimental results show that LensGNN outperforms existing models. This research advances text-attributed graph ensemble learning by providing a robust and superior solution for integrating semantic and structural information. We provide our code and data here: https://github.com/AquariusAQ/LensGNN.
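The two-stage alignment described in the abstract can be illustrated with a minimal NumPy sketch. All names, dimensions, and weights below are hypothetical stand-ins (random arrays in place of trained GNN outputs and learned projections), not the paper's actual implementation; it only shows the shape-level idea of mapping multiple GNN representations into one space and then prepending the resulting "graph tokens" to an LLM's text embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for node embeddings from two different GNNs (e.g. a GCN
# and a GAT) for the same target node; their dimensions differ.
gnn_outputs = {"GCN": rng.normal(size=64), "GAT": rng.normal(size=128)}

SHARED_DIM = 32   # common space for multi-GNN alignment (assumed)
LLM_DIM = 256     # hypothetical LLM hidden size

# Stage 1: one projection per GNN maps its output into the shared
# space. Random weights here stand in for trained parameters.
align_proj = {name: rng.normal(size=(vec.size, SHARED_DIM)) * 0.1
              for name, vec in gnn_outputs.items()}

# Stage 2: a second projection maps shared-space vectors into the
# LLM's token-embedding space, yielding one "graph token" per GNN.
to_llm = rng.normal(size=(SHARED_DIM, LLM_DIM)) * 0.1

graph_tokens = np.stack([gnn_outputs[name] @ align_proj[name] @ to_llm
                         for name in gnn_outputs])  # (num_gnns, LLM_DIM)

# The graph tokens are prepended to the text-token embeddings, so a
# (LoRA-tuned) LLM can attend to both structure and text.
text_embeds = rng.normal(size=(10, LLM_DIM))  # 10 hypothetical text tokens
llm_input = np.concatenate([graph_tokens, text_embeds], axis=0)
print(llm_input.shape)  # (12, 256)
```

In the full model these projections would be trained (and the LLM adapted via LoRA) rather than fixed random matrices; the sketch only conveys how representations from heterogeneous GNNs end up as extra tokens in the LLM's input sequence.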
Anthology ID:
2025.emnlp-main.1470
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
28863–28882
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1470/
Cite (ACL):
Hanqi Duan, Yao Cheng, Jianxiang Yu, Yao Liu, and Xiang Li. 2025. Can Large Language Models Act as Ensembler for Multi-GNNs?. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 28863–28882, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Can Large Language Models Act as Ensembler for Multi-GNNs? (Duan et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1470.pdf
Checklist:
2025.emnlp-main.1470.checklist.pdf