Abstract
Medical systematic reviews play a vital role in healthcare decision making and policy. However, their production is time-consuming, limiting the availability of high-quality and up-to-date evidence summaries. Recent advancements in LLMs offer the potential to automatically generate literature reviews on demand, addressing this issue. However, LLMs sometimes generate inaccurate (and potentially misleading) texts by hallucination or omission. In healthcare, this can make LLMs unusable at best and dangerous at worst. We conducted 16 interviews with international systematic review experts to characterize the perceived utility and risks of LLMs in the specific context of medical evidence reviews. Experts indicated that LLMs can assist in the writing process by drafting summaries, generating templates, distilling information, and crosschecking information. They also raised concerns regarding confidently composed but inaccurate LLM outputs and other potential downstream harms, including decreased accountability and proliferation of low-quality reviews. Informed by this qualitative analysis, we identify criteria for rigorous evaluation of biomedical LLMs aligned with domain expert views.

- Anthology ID: 2023.emnlp-main.626
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 10122–10139
- URL: https://aclanthology.org/2023.emnlp-main.626
- DOI: 10.18653/v1/2023.emnlp-main.626
- Cite (ACL): Hye Yun, Iain Marshall, Thomas Trikalinos, and Byron Wallace. 2023. Appraising the Potential Uses and Harms of LLMs for Medical Systematic Reviews. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 10122–10139, Singapore. Association for Computational Linguistics.
- Cite (Informal): Appraising the Potential Uses and Harms of LLMs for Medical Systematic Reviews (Yun et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/improve-issue-templates/2023.emnlp-main.626.pdf