Team XSZ at BioLaySumm2025: Section-Wise Summarization, Retrieval-Augmented LLM, and Reinforcement Learning Fine-Tuning for Lay Summaries

Pengcheng Xu, Sicheng Shen, Jieli Zhou, Hongyi Xin


Abstract
We propose a unified, multi-stage lay summarization pipeline for BioLaySumm 2025 (Subtask 1.1) that (1) selects and summarizes key article sections via BioBART, (2) retrieves K-shot demonstrations using BGE embeddings for in-context Llama 3 8B prompting, (3) applies LoRA adapters to Llama 3 8B for supervised fine-tuning, (4) merges section summaries with a second BioBART pass, and (5) refines outputs through reinforcement learning (PPO & GRPO) using a composite reward of factuality (AlignScore, SummaC), relevance (ROUGE-L, BERTScore), and readability (LENS, FKGL, DCRS, CLI). On the PLOS and eLife validation sets, our complete system reduces DCRS from 9.23 to 8.56 and CLI from 12.98 to 12.65, ranking 3rd in readability, and improves AlignScore from 0.722 to 0.862 over the Llama 3 fine-tuned baseline, ranking 5th in factuality, demonstrating balanced gains across readability, relevance, and factuality.
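The sketch below illustrates how a composite reward of the kind described in the abstract could be assembled for PPO/GRPO refinement. It is not the authors' implementation: the component scores (AlignScore, SummaC, ROUGE-L, BERTScore, LENS, FKGL, DCRS, CLI) are assumed to be precomputed by their respective toolkits, and the weights and the normalization of the lower-is-better readability grades are illustrative assumptions, not values from the paper.

# Minimal sketch (assumed, not the authors' code) of combining precomputed
# factuality, relevance, and readability metrics into one scalar reward.
from dataclasses import dataclass


@dataclass
class MetricScores:
    align_score: float   # factuality, higher is better, roughly [0, 1]
    summac: float        # factuality, higher is better, roughly [0, 1]
    rouge_l: float       # relevance, higher is better, roughly [0, 1]
    bert_score: float    # relevance, higher is better, roughly [0, 1]
    lens: float          # readability, higher is better, roughly [0, 100]
    fkgl: float          # readability grade level, lower is better
    dcrs: float          # Dale-Chall readability score, lower is better
    cli: float           # Coleman-Liau index, lower is better


def composite_reward(m: MetricScores) -> float:
    """Combine metric scores into one reward; weights are hypothetical."""
    factuality = 0.5 * m.align_score + 0.5 * m.summac
    relevance = 0.5 * m.rouge_l + 0.5 * m.bert_score
    # Map the lower-is-better grade metrics into [0, 1] by assuming a
    # rough 0-20 grade range; LENS is scaled from its approximate 0-100 range.
    readability = (
        m.lens / 100.0
        + (1.0 - min(m.fkgl, 20.0) / 20.0)
        + (1.0 - min(m.dcrs, 20.0) / 20.0)
        + (1.0 - min(m.cli, 20.0) / 20.0)
    ) / 4.0
    # Equal weighting of the three criteria is an assumption for illustration.
    return (factuality + relevance + readability) / 3.0


if __name__ == "__main__":
    example = MetricScores(
        align_score=0.86, summac=0.70, rouge_l=0.48, bert_score=0.86,
        lens=65.0, fkgl=10.5, dcrs=8.6, cli=12.7,
    )
    print(f"composite reward: {composite_reward(example):.3f}")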
Anthology ID: 2025.bionlp-share.33
Volume: Proceedings of the 24th Workshop on Biomedical Language Processing (Shared Tasks)
Month: August
Year: 2025
Address: Vienna, Austria
Editors: Sarvesh Soni, Dina Demner-Fushman
Venues: BioNLP | WS
Publisher: Association for Computational Linguistics
Pages: 275–280
URL: https://preview.aclanthology.org/landing_page/2025.bionlp-share.33/
DOI: 10.18653/v1/2025.bionlp-share.33
Cite (ACL):
Pengcheng Xu, Sicheng Shen, Jieli Zhou, and Hongyi Xin. 2025. Team XSZ at BioLaySumm2025: Section-Wise Summarization, Retrieval-Augmented LLM, and Reinforcement Learning Fine-Tuning for Lay Summaries. In Proceedings of the 24th Workshop on Biomedical Language Processing (Shared Tasks), pages 275–280, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Team XSZ at BioLaySumm2025: Section-Wise Summarization, Retrieval-Augmented LLM, and Reinforcement Learning Fine-Tuning for Lay Summaries (Xu et al., BioNLP 2025)
PDF: https://preview.aclanthology.org/landing_page/2025.bionlp-share.33.pdf