Format Inertia: A Failure Mechanism of LLMs in Medical Pre-Consultation

Seungseop Lim, Gibaeg Kim, Wooseok Han, Jean Seo, Hyunkyung Lee, Jaehyo Yoo, Eunho Yang


Abstract
Recent advances in Large Language Models (LLMs) have brought significant improvements to various service domains, including chatbots and medical pre-consultation applications. In the healthcare domain, the most common approach for adapting LLMs to multi-turn dialogue generation is Supervised Fine-Tuning (SFT). However, datasets for SFT in tasks like medical pre-consultation typically exhibit a skewed turn-count distribution. Training on such data induces a novel failure mechanism we term Format Inertia, in which models generate repetitive, format-correct, but diagnostically uninformative questions in long medical dialogues. To mitigate this failure mechanism, we adopt a simple, data-centric method that rebalances the turn-count distribution of the training dataset. Experimental results show that our approach substantially alleviates Format Inertia in medical pre-consultation.
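The page does not specify the rebalancing procedure beyond "rebalances the turn-count distribution," so the Python sketch below is only one plausible reading: it buckets training dialogues by turn count and downsamples over-represented buckets to the size of the rarest one. The function name, the dialogue schema (a dict with a "turns" list), and the capping strategy are all assumptions for illustration, not the authors' method.

```python
import random
from collections import defaultdict

def rebalance_by_turn_count(dialogues, seed=0):
    """Hypothetical sketch: flatten a skewed turn-count distribution
    by downsampling over-represented turn counts.

    `dialogues` is assumed to be a list of dicts, each with a
    "turns" key holding the list of utterances in that dialogue.
    """
    if not dialogues:
        return []
    rng = random.Random(seed)

    # Group dialogues by their number of turns.
    buckets = defaultdict(list)
    for d in dialogues:
        buckets[len(d["turns"])].append(d)

    # Cap every bucket at the size of the smallest non-empty bucket,
    # one simple way to equalize the turn-count distribution.
    cap = min(len(b) for b in buckets.values())
    balanced = []
    for _, bucket in sorted(buckets.items()):
        balanced.extend(rng.sample(bucket, cap))

    rng.shuffle(balanced)
    return balanced
```

In practice one might upsample rare long-dialogue examples instead of discarding data; the choice between downsampling and upsampling depends on how much data the tail turn counts actually contain.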
Anthology ID:
2025.emnlp-industry.101
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1437–1450
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.101/
Cite (ACL):
Seungseop Lim, Gibaeg Kim, Wooseok Han, Jean Seo, Hyunkyung Lee, Jaehyo Yoo, and Eunho Yang. 2025. Format Inertia: A Failure Mechanism of LLMs in Medical Pre-Consultation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 1437–1450, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Format Inertia: A Failure Mechanism of LLMs in Medical Pre-Consultation (Lim et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.101.pdf