BDA-UC3M @ BioLaySumm: Efficient Lay Summarization with Small-Scale SoTA LLMs

Ilyass Ramzi, Isabel Bedmar


Abstract
This paper presents an efficient system for the BioLaySumm 2025 Shared Task on biomedical lay summarization. The approach leverages compact, state-of-the-art language models (4–7 billion parameters), including Gemma3 4B, Qwen3 4B, and GPT-4.1-mini, optimized for relevance, readability, and factuality. Through dynamic 4-bit quantization, parameter-efficient fine-tuning, advanced extractive preprocessing, and direct preference optimization, the system achieves performance competitive with much larger baselines. Comprehensive experiments on the eLife and PLOS datasets demonstrate that small language models can deliver high-quality, accessible biomedical summaries using modest computational resources. The findings suggest that resource-efficient models can help democratize access to scientific information, supporting broader scientific communication goals.
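As a rough illustration of the pipeline the abstract describes, the following Python sketch shows how a compact model could be loaded with dynamic 4-bit (NF4) quantization and wrapped with LoRA adapters for parameter-efficient fine-tuning. The library stack (Hugging Face transformers, bitsandbytes, peft), the model id, and all hyperparameters are illustrative assumptions, not the authors' actual configuration, which is given in the full paper.

# Hypothetical sketch: 4-bit quantized loading plus LoRA adapters for a
# compact model. Libraries, model id, and hyperparameters are assumptions
# for illustration; they are not taken from the paper.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Dynamic 4-bit quantization (NF4 with double quantization).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model_id = "Qwen/Qwen3-4B"  # one of the compact models named in the abstract
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Parameter-efficient fine-tuning: train small LoRA adapters only,
# leaving the quantized base weights frozen.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of weights

A direct preference optimization stage over chosen/rejected summary pairs (for example, with trl's DPOTrainer) would follow this setup; see the paper for the actual training recipe and the extractive preprocessing step.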
Anthology ID:
2025.bionlp-share.30
Volume:
Proceedings of the 24th Workshop on Biomedical Language Processing (Shared Tasks)
Month:
August
Year:
2025
Address:
Vienna, Austria
Editors:
Sarvesh Soni, Dina Demner-Fushman
Venues:
BioNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
249–255
URL:
https://preview.aclanthology.org/landing_page/2025.bionlp-share.30/
DOI:
10.18653/v1/2025.bionlp-share.30
Cite (ACL):
Ilyass Ramzi and Isabel Bedmar. 2025. BDA-UC3M @ BioLaySumm: Efficient Lay Summarization with Small-Scale SoTA LLMs. In Proceedings of the 24th Workshop on Biomedical Language Processing (Shared Tasks), pages 249–255, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
BDA-UC3M @ BioLaySumm: Efficient Lay Summarization with Small-Scale SoTA LLMs (Ramzi & Bedmar, BioNLP 2025)
PDF:
https://preview.aclanthology.org/landing_page/2025.bionlp-share.30.pdf