GRIT: Guided Relational Integration for Efficient Multi-Table Understanding

Yujin Kang, Park Seong Woo, Yoon-Sik Cho


Abstract
Recent advances in large language models (LLMs) have opened new possibilities for table-based tasks. However, most existing methods remain confined to single-table settings, limiting their applicability to real-world databases composed of multiple interrelated tables. In multi-table scenarios, LLMs face two key challenges: reasoning over relational structures beyond sequential text, and handling the input length limitations imposed by large-scale table concatenation. To address these issues, we propose Guided Relational Integration for multiple Tables (GRIT), a lightweight method that converts relational schemas into LLM-friendly textual representations. GRIT employs hashing-based techniques to efficiently infer primary–foreign key relationships and constructs prompts that explicitly encode relevant join paths and question-relevant columns. When applied to off-the-shelf LLMs, GRIT consistently improves table-column retrieval performance across diverse multi-table benchmarks while significantly reducing memory and computational overhead.
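The hashing-based primary–foreign key inference the abstract describes can be illustrated with a minimal sketch: hash each column's values into a set, treat columns with all-distinct hashes as candidate primary keys, and flag a candidate foreign key when another table's hashed values are contained in a candidate key's hash set. The function names and containment heuristic below are illustrative assumptions, not GRIT's exact algorithm.

```python
import hashlib

def column_signature(values):
    # Hash each distinct value so columns can be compared without
    # retaining raw data (illustrative helper, not GRIT's exact scheme).
    return {hashlib.md5(str(v).encode()).hexdigest() for v in values}

def infer_pk_fk(tables):
    """Return candidate (PK column, FK column) pairs across tables.

    A column is a candidate PK if its hashed values are all distinct;
    a column in another table is a candidate FK if every one of its
    hashed values appears in the PK column's hash set.
    """
    sigs, is_unique = {}, {}
    for tname, cols in tables.items():
        for cname, values in cols.items():
            sig = column_signature(values)
            sigs[(tname, cname)] = sig
            is_unique[(tname, cname)] = len(sig) == len(values)

    pairs = []
    for pk, pk_sig in sigs.items():
        if not is_unique[pk]:
            continue  # not a candidate primary key
        for fk, fk_sig in sigs.items():
            if fk[0] == pk[0]:
                continue  # only relate columns across different tables
            if fk_sig and fk_sig <= pk_sig:
                pairs.append((pk, fk))
    return pairs

# Toy two-table database (hypothetical example data).
tables = {
    "users": {"id": [1, 2, 3], "name": ["a", "b", "c"]},
    "orders": {"user_id": [1, 1, 3], "amount": [10, 20, 30]},
}
print(infer_pk_fk(tables))  # → [(('users', 'id'), ('orders', 'user_id'))]
```

Hash-set containment only signals a *candidate* relationship; a real pipeline would also score coincidental containments (e.g. two unrelated integer columns) before serializing the inferred join paths into the prompt.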
Anthology ID:
2025.emnlp-main.1118
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
21995–22008
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1118/
Cite (ACL):
Yujin Kang, Park Seong Woo, and Yoon-Sik Cho. 2025. GRIT: Guided Relational Integration for Efficient Multi-Table Understanding. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 21995–22008, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
GRIT: Guided Relational Integration for Efficient Multi-Table Understanding (Kang et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.1118.pdf
Checklist:
 2025.emnlp-main.1118.checklist.pdf