Automatic Inter-document Multi-hop Scientific QA Generation

Seungmin Lee, Dongha Kim, Yuni Jeon, Junyoung Koh, Min Song


Abstract
Existing automatic scientific question generation studies mainly focus on single-document factoid QA, overlooking the inter-document reasoning crucial for scientific understanding. We present AIM-SciQA, an automated framework for generating multi-document, multi-hop scientific QA datasets. AIM-SciQA extracts single-hop QAs using large language models (LLMs) with machine reading comprehension and constructs cross-document relations based on embedding-based semantic alignment while selectively leveraging citation information. Applied to 8,211 PubMed Central papers, it produced 411,409 single-hop and 13,672 multi-hop QAs, forming the IM-SciQA dataset. Human and automatic validation confirmed high factual consistency, and experimental results demonstrate that IM-SciQA effectively differentiates reasoning capabilities across retrieval and QA stages, providing a realistic and interpretable benchmark for retrieval-augmented scientific reasoning. We further extend this framework to construct CIM-SciQA, a citation-guided variant achieving comparable performance to the Oracle setting, reinforcing the dataset’s validity and generality.
Anthology ID:
2026.lrec-main.409
Volume:
Proceedings of the Fifteenth Language Resources and Evaluation Conference
Month:
May
Year:
2026
Address:
Palma de Mallorca, Spain
Editors:
Stelios Piperidis, Núria Bel, Henk van den Heuvel, Nancy Ide, Simon Krek, Antonio Toral
Venue:
LREC
Publisher:
ELRA Language Resources Association
Pages:
5232–5245
URL:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.409/
Cite (ACL):
Seungmin Lee, Dongha Kim, Yuni Jeon, Junyoung Koh, and Min Song. 2026. Automatic Inter-document Multi-hop Scientific QA Generation. International Conference on Language Resources and Evaluation, main:5232–5245.
Cite (Informal):
Automatic Inter-document Multi-hop Scientific QA Generation (Lee et al., LREC 2026)
PDF:
https://preview.aclanthology.org/ingest-lrec/2026.lrec-main.409.pdf