Joint Enhancement of Relational Reasoning for Long-Context LLMs

Zhirui Chen, Wei Shen, Jiashui Huang, Ling Shao


Abstract
Despite significant progress, large language models (LLMs) still struggle with long contexts because of memory limitations and difficulty tackling complex tasks over extended inputs. They also often lack transparency and are prone to hallucination. To address these challenges, we propose JERR, a novel framework designed to enhance long-context comprehension via graph-based reasoning in LLMs. JERR integrates three key components: synopsis extraction, graph construction, and relational reasoning. First, synopses are extracted by strategically chunking the text, allowing the model to summarize and understand information more efficiently. Second, we build a directed acyclic graph (DAG) over the synopses to resolve redundancy, ensuring logical consistency and clarity. Finally, we incorporate Monte Carlo Tree Search (MCTS) to help the model navigate complex reasoning paths, yielding more accurate and interpretable outputs. Together, these components enable LLMs to handle extended contexts and complex reasoning tasks with improved reliability and transparency. Experimental results show that JERR consistently outperforms all baselines on ROUGE and F1 and achieves the highest scores under the LLM-Rater evaluation.
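
The abstract outlines a three-stage pipeline (chunk-and-summarize, DAG construction, MCTS-guided reasoning) without implementation details. Below is a minimal, self-contained Python sketch of that shape, for illustration only: the names chunk_text, summarize, build_dag, and mcts_path are hypothetical; the fixed-window chunking, exact-duplicate filtering, and forward-only edge scheme are assumptions; and summarize stands in for an LLM call. None of this is the authors' implementation.

# Illustrative sketch only: placeholder names and heuristics, not the
# authors' code. `summarize` stands in for an LLM summarization call.
import math
import random


def chunk_text(text, size=200):
    """Split text into fixed-size word windows (the paper's chunking
    strategy is unspecified here; fixed windows are an assumption)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]


def summarize(chunk):
    """Placeholder synopsis extractor; a real system would prompt an LLM."""
    return chunk[:80]


def build_dag(synopses):
    """Build a DAG of synopsis nodes. Exact-duplicate filtering stands in
    for redundancy resolution; forward-only edges guarantee acyclicity."""
    nodes = []
    for s in synopses:
        if s not in nodes:
            nodes.append(s)
    edges = {i: [j for j in (i + 1, i + 2) if j < len(nodes)]
             for i in range(len(nodes))}
    return nodes, edges


def mcts_path(nodes, edges, score, iters=200, c=1.4):
    """Toy UCT search for a high-scoring root-to-leaf path through the DAG.
    `score` is any caller-supplied heuristic that rates a candidate path."""
    stats = {}  # node -> (visits, total_reward)
    best, best_reward = None, float("-inf")
    for _ in range(iters):
        path, node = [0], 0
        while edges[node]:  # selection: descend by UCT until a leaf
            parent_visits = stats.get(node, (1, 0.0))[0]

            def uct(child):
                v, r = stats.get(child, (0, 0.0))
                if v == 0:
                    return float("inf")  # visit unexplored children first
                return r / v + c * math.sqrt(math.log(parent_visits + 1) / v)

            node = max(edges[node], key=uct)
            path.append(node)
        reward = score([nodes[i] for i in path])  # evaluation / rollout
        for n in path:  # backpropagation
            v, r = stats.get(n, (0, 0.0))
            stats[n] = (v + 1, r + reward)
        if reward > best_reward:
            best, best_reward = path, reward
    return best


if __name__ == "__main__":
    doc = " ".join(f"fact{i}" for i in range(3000))
    synopses = [summarize(ch) for ch in chunk_text(doc)]
    nodes, edges = build_dag(synopses)
    # Hypothetical scorer: prefer short paths, with noise for exploration.
    path = mcts_path(nodes, edges, score=lambda p: -len(p) + random.random())
    print("selected reasoning path:", path)

In this sketch, forward-only edges make acyclicity hold by construction, which is one simple way to satisfy the DAG requirement; the paper's actual edge semantics and path scorer would replace these placeholders.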
Anthology ID:
2025.findings-emnlp.462
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8706–8720
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.462/
DOI:
10.18653/v1/2025.findings-emnlp.462
Cite (ACL):
Zhirui Chen, Wei Shen, Jiashui Huang, and Ling Shao. 2025. Joint Enhancement of Relational Reasoning for Long-Context LLMs. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 8706–8720, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Joint Enhancement of Relational Reasoning for Long-Context LLMs (Chen et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.462.pdf
Checklist:
2025.findings-emnlp.462.checklist.pdf