Incorporating Domain Knowledge into Language Transformers for Multi-Label Classification of Chinese Medical Questions

Po-Han Chen, Yu-Xiang Zeng, Lung-Hao Lee


Abstract
In this paper, we propose a knowledge infusion mechanism to incorporate domain knowledge into language transformers. Weakly supervised data is regarded as the main source for knowledge acquisition. We pre-train the language models to capture masked knowledge of focuses and aspects, and then fine-tune them to obtain better performance on the downstream tasks. Due to the lack of publicly available datasets for multi-label classification of Chinese medical questions, we crawled questions from medical question/answer forums and manually annotated them with eight predefined classes: persons and organizations, symptom, cause, examination, disease, information, ingredient, and treatment. In total, we collected 1,814 questions with 2,340 labels, an average of 1.29 labels per question. We used the Baidu Medical Encyclopedia as the knowledge resource. Two transformers, BERT and RoBERTa, were implemented to compare performance on our constructed dataset. Experimental results showed that our proposed model with the knowledge infusion mechanism achieved better performance regardless of the evaluation metric considered: Macro F1, Micro F1, Weighted F1, or Subset Accuracy.
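The abstract reports four multi-label evaluation metrics: Macro F1, Micro F1, Weighted F1, and Subset Accuracy. As a minimal sketch of how these are typically computed, the following pure-Python example evaluates toy predictions over a few of the paper's label classes. The label names come from the abstract, but the example questions and predictions are invented for illustration; the paper's actual implementation details are not specified here.

```python
# Sketch of the four multi-label metrics named in the abstract:
# Macro F1, Micro F1, Weighted F1, and Subset Accuracy.
# Toy data only; not the authors' code or dataset.

def confusion_counts(y_true, y_pred, label):
    """Per-label true-positive, false-positive, false-negative counts."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if label in t and label in p)
    fp = sum(1 for t, p in zip(y_true, y_pred) if label not in t and label in p)
    fn = sum(1 for t, p in zip(y_true, y_pred) if label in t and label not in p)
    return tp, fp, fn

def f1(tp, fp, fn):
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def multilabel_metrics(y_true, y_pred, labels):
    per_label = {l: confusion_counts(y_true, y_pred, l) for l in labels}
    # Macro F1: unweighted mean of per-label F1 scores.
    macro = sum(f1(*c) for c in per_label.values()) / len(labels)
    # Micro F1: F1 over counts pooled across all labels.
    tp = sum(c[0] for c in per_label.values())
    fp = sum(c[1] for c in per_label.values())
    fn = sum(c[2] for c in per_label.values())
    micro = f1(tp, fp, fn)
    # Weighted F1: per-label F1 weighted by label frequency (support).
    support = {l: sum(1 for t in y_true if l in t) for l in labels}
    total = sum(support.values())
    weighted = sum(f1(*per_label[l]) * support[l] for l in labels) / total
    # Subset Accuracy: exact match of the full label set per question.
    subset = sum(1 for t, p in zip(y_true, y_pred) if set(t) == set(p)) / len(y_true)
    return macro, micro, weighted, subset

# Three hypothetical questions, each with a gold label set and a predicted set.
labels = ["symptom", "cause", "treatment"]
y_true = [{"symptom"}, {"symptom", "cause"}, {"treatment"}]
y_pred = [{"symptom"}, {"symptom"}, {"treatment", "cause"}]
macro, micro, weighted, subset = multilabel_metrics(y_true, y_pred, labels)
```

Subset Accuracy is the strictest of the four, since a question counts as correct only when its entire predicted label set matches the gold set, which is why it is typically the lowest number reported alongside the F1 variants.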
Anthology ID:
2021.rocling-1.34
Volume:
Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021)
Month:
October
Year:
2021
Address:
Taoyuan, Taiwan
Editors:
Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen
Venue:
ROCLING
Publisher:
The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages:
265–270
URL:
https://aclanthology.org/2021.rocling-1.34
Cite (ACL):
Po-Han Chen, Yu-Xiang Zeng, and Lung-Hao Lee. 2021. Incorporating Domain Knowledge into Language Transformers for Multi-Label Classification of Chinese Medical Questions. In Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021), pages 265–270, Taoyuan, Taiwan. The Association for Computational Linguistics and Chinese Language Processing (ACLCLP).
Cite (Informal):
Incorporating Domain Knowledge into Language Transformers for Multi-Label Classification of Chinese Medical Questions (Chen et al., ROCLING 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.rocling-1.34.pdf