OkraLong: A Flexible Retrieval-Augmented Framework for Long-Text Question Answering

Yulong Hui, Yihao Liu, Yao Lu, Huanchen Zhang


Abstract
Large Language Models (LLMs) encounter challenges in efficiently answering long-text questions, as seen in applications like enterprise document analysis and financial report comprehension. While conventional solutions employ long-context processing or Retrieval-Augmented Generation (RAG), they suffer from prohibitive input costs or incomplete information. Recent advancements adopt context compression and dynamic retrieval loops, but still sacrifice critical details or incur iterative costs. To address these limitations, we propose OkraLong, a novel framework that flexibly optimizes the entire processing workflow. Unlike prior static or coarse-grained adaptive strategies, OkraLong adopts fine-grained orchestration through three synergistic components: an analyzer, an organizer, and an executor. The analyzer characterizes the task states, which guide the organizer in dynamically scheduling the workflow. The executor then carries out the scheduled workflow and generates the final answer. Experimental results demonstrate that OkraLong not only enhances answer accuracy by 5.7%-41.2%, but also achieves cost savings of 1.3x-4.7x.
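
To make the abstract's analyzer-organizer-executor pipeline concrete, here is a minimal sketch of how such an orchestration loop could be wired together. Every name in it (TaskState, analyze, organize, execute) and every heuristic (the length threshold, the keyword retriever, the truncation compressor) is a hypothetical stand-in for illustration, not the paper's actual implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TaskState:
    """Hypothetical characterization of a long-text QA task."""
    question: str
    needs_retrieval: bool    # e.g., relevant evidence spread across the document
    est_context_tokens: int  # rough size of the context worth feeding the LLM

def analyze(question: str, document: str) -> TaskState:
    """Analyzer: characterize the task (placeholder heuristic)."""
    return TaskState(
        question=question,
        needs_retrieval=len(document) > 4_000,
        est_context_tokens=min(len(document) // 4, 8_000),
    )

def organize(state: TaskState) -> List[str]:
    """Organizer: schedule a workflow based on the task state."""
    if state.needs_retrieval:
        return ["retrieve", "compress", "generate"]
    return ["generate"]  # short documents go straight to the LLM

def execute(plan: List[str], question: str, document: str) -> str:
    """Executor: run each scheduled step and produce the final answer."""
    context = document
    for step in plan:
        if step == "retrieve":
            # stand-in for a real retriever over document chunks
            context = " ".join(c for c in document.split(". ")
                               if any(w in c for w in question.split()))
        elif step == "compress":
            context = context[:2_000]  # stand-in for context compression
        elif step == "generate":
            # stand-in for an actual LLM call on the prepared context
            return f"[LLM answer to {question!r} given {len(context)} chars]"
    return ""

if __name__ == "__main__":
    doc = "OkraLong schedules workflows adaptively. " * 200
    state = analyze("How does OkraLong schedule workflows?", doc)
    print(execute(organize(state), state.question, doc))
```

The point of the structure, under these assumptions, is that the organizer's plan is data-dependent: cheap single-pass generation for short inputs, and retrieval plus compression only when the analyzer deems the document too large to process whole.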
Anthology ID:
2025.findings-emnlp.890
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2025
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
16395–16409
URL:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.890/
DOI:
10.18653/v1/2025.findings-emnlp.890
Cite (ACL):
Yulong Hui, Yihao Liu, Yao Lu, and Huanchen Zhang. 2025. OkraLong: A Flexible Retrieval-Augmented Framework for Long-Text Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2025, pages 16395–16409, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
OkraLong: A Flexible Retrieval-Augmented Framework for Long-Text Question Answering (Hui et al., Findings 2025)
PDF:
https://preview.aclanthology.org/author-page-yu-wang-polytechnic/2025.findings-emnlp.890.pdf
Checklist:
2025.findings-emnlp.890.checklist.pdf