Few-shot fine-tuning SOTA summarization models for medical dialogues
David Fraile Navarro | Mark Dras | Shlomo Berkovsky
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop
Abstractive summarization of medical dialogues presents a challenge for standard training approaches, given the paucity of suitable datasets. We explore the performance of state-of-the-art summarization models under zero-shot and few-shot learning strategies, and measure the impact of pretraining on general-domain and dialogue-specific text on summarization performance.