UIC at ArchEHR-QA 2025: Tri-Step Pipeline for Reliable Grounded Medical Question Answering
Mohammad Arvan, Anuj Gautam, Mohan Zalake, Karl M. Kochendorfer
Abstract
Automated response generation from electronic health records (EHRs) holds potential to reduce clinician workload, but it introduces important challenges related to factual accuracy and reliable grounding in clinical evidence. We present a structured three-step pipeline that uses large language models (LLMs) for evidence classification, guided response generation, and iterative quality control. To enable rigorous evaluation, our framework combines traditional reference-based metrics with a claim-level “LLM-as-a-Judge” methodology. On the ArchEHR-QA benchmark, our system achieves 82.0 percent claim-level evidence faithfulness and 51.6 percent citation-level factuality, demonstrating strong performance in generating clinically grounded responses. These findings highlight the utility of structured LLM pipelines in healthcare applications, while also underscoring the importance of transparent evaluation and continued refinement. All code, prompt templates, and evaluation tools are publicly available.
- Anthology ID: 2025.bionlp-share.14
- Volume: BioNLP 2025 Shared Tasks
- Month: August
- Year: 2025
- Address: Vienna, Austria
- Editors: Sarvesh Soni, Dina Demner-Fushman
- Venues: BioNLP | WS
- Publisher: Association for Computational Linguistics
- Pages: 110–117
- URL: https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bionlp-share.14/
- Cite (ACL): Mohammad Arvan, Anuj Gautam, Mohan Zalake, and Karl M. Kochendorfer. 2025. UIC at ArchEHR-QA 2025: Tri-Step Pipeline for Reliable Grounded Medical Question Answering. In BioNLP 2025 Shared Tasks, pages 110–117, Vienna, Austria. Association for Computational Linguistics.
- Cite (Informal): UIC at ArchEHR-QA 2025: Tri-Step Pipeline for Reliable Grounded Medical Question Answering (Arvan et al., BioNLP 2025)
- PDF: https://preview.aclanthology.org/acl25-workshop-ingestion/2025.bionlp-share.14.pdf
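The tri-step pipeline named in the abstract (evidence classification, guided response generation, iterative quality control) can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the function names, prompt wording, and labels are all hypothetical, and `llm` stands in for any prompt-to-text completion function.

```python
from typing import Callable, List

def classify_evidence(llm: Callable[[str], str], question: str,
                      sentences: List[str]) -> List[str]:
    """Step 1 (sketch): keep only EHR sentences the LLM labels essential."""
    essential = []
    for sent in sentences:
        label = llm(f"Question: {question}\nSentence: {sent}\n"
                    f"Label (essential/not-relevant):")
        if label.strip().lower().startswith("essential"):
            essential.append(sent)
    return essential

def generate_answer(llm: Callable[[str], str], question: str,
                    evidence: List[str]) -> str:
    """Step 2 (sketch): draft a response grounded in the selected evidence."""
    cited = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(evidence))
    return llm(f"Answer using ONLY the cited evidence.\n{cited}\n"
               f"Question: {question}\nAnswer:")

def quality_control(llm: Callable[[str], str], question: str,
                    evidence: List[str], answer: str,
                    max_rounds: int = 2) -> str:
    """Step 3 (sketch): iteratively verify the draft and revise if needed."""
    for _ in range(max_rounds):
        verdict = llm(f"Is every claim supported by the evidence? (yes/no)\n"
                      f"Evidence: {evidence}\nAnswer: {answer}")
        if verdict.strip().lower().startswith("yes"):
            break
        answer = llm(f"Revise so every claim cites the evidence.\n"
                     f"Evidence: {evidence}\nQuestion: {question}\n"
                     f"Draft: {answer}")
    return answer

def tri_step_pipeline(llm: Callable[[str], str], question: str,
                      ehr_sentences: List[str]) -> str:
    evidence = classify_evidence(llm, question, ehr_sentences)
    draft = generate_answer(llm, question, evidence)
    return quality_control(llm, question, evidence, draft)
```

The quality-control loop is what distinguishes this design from single-pass generation: the draft is re-checked against the evidence and only released once the verifier accepts it or the round budget runs out.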