Prompt Combines Paraphrase: Teaching Pre-trained Models to Understand Rare Biomedical Words

Haochun Wang, Chi Liu, Nuwa Xi, Sendong Zhao, Meizhi Ju, Shiwei Zhang, Ziheng Zhang, Yefeng Zheng, Bing Qin, Ting Liu


Abstract
Prompt-based fine-tuning of pre-trained models has proven effective for many natural language processing tasks under few-shot settings in the general domain. However, prompt-based tuning in the biomedical domain has not been investigated thoroughly. Biomedical words are often rare in the general domain but ubiquitous in biomedical contexts, which dramatically degrades the performance of pre-trained models on downstream biomedical applications even after fine-tuning, especially in low-resource scenarios. We propose a simple yet effective approach that helps models learn rare biomedical words during prompt-based tuning. Experimental results show that our method achieves up to a 6% improvement on a biomedical natural language inference task, without any extra parameters or training steps, under few-shot vanilla prompt settings.
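The core idea is to pair a cloze-style prompt with plain-language paraphrases of rare biomedical terms, so the model sees a familiar gloss next to each unfamiliar word. Below is a minimal Python sketch of this idea, assuming a hypothetical paraphrase dictionary and an NLI prompt template; the authors' actual templates and paraphrase source are described in the paper and the linked repository, and everything named here is illustrative only.

# A minimal sketch (not the authors' exact implementation): append a
# plain-language paraphrase after each known rare biomedical word, then
# wrap the input in a cloze-style prompt for a masked language model.

RARE_WORD_PARAPHRASES = {
    # Hypothetical entries; real paraphrases could come from a
    # biomedical resource such as a medical dictionary.
    "dyspnea": "difficulty breathing",
    "epistaxis": "nosebleed",
}

def add_paraphrases(text: str) -> str:
    """Insert a paraphrase in parentheses after each known rare word."""
    for word, gloss in RARE_WORD_PARAPHRASES.items():
        text = text.replace(word, f"{word} ({gloss})")
    return text

def build_nli_prompt(premise: str, hypothesis: str) -> str:
    """Cloze-style NLI prompt; a masked language model fills [MASK]
    with a label word such as 'yes', 'maybe', or 'no'."""
    return f"{add_paraphrases(premise)} ? [MASK] , {add_paraphrases(hypothesis)}"

if __name__ == "__main__":
    print(build_nli_prompt(
        "The patient reports dyspnea on exertion.",
        "The patient has trouble breathing during activity.",
    ))

Because the paraphrase is injected into the input text rather than into the model, this kind of approach adds no parameters and no training steps, consistent with the claim in the abstract.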
Anthology ID:
2022.coling-1.122
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1422–1431
URL:
https://aclanthology.org/2022.coling-1.122
Cite (ACL):
Haochun Wang, Chi Liu, Nuwa Xi, Sendong Zhao, Meizhi Ju, Shiwei Zhang, Ziheng Zhang, Yefeng Zheng, Bing Qin, and Ting Liu. 2022. Prompt Combines Paraphrase: Teaching Pre-trained Models to Understand Rare Biomedical Words. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1422–1431, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Prompt Combines Paraphrase: Teaching Pre-trained Models to Understand Rare Biomedical Words (Wang et al., COLING 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.coling-1.122.pdf
Code:
s65b40/prompt_n_paraphrase
Data:
MIMIC-III