Towards Medical Machine Reading Comprehension with Structural Knowledge and Plain Text

Dongfang Li, Baotian Hu, Qingcai Chen, Weihua Peng, Anqi Wang


Abstract
Machine reading comprehension (MRC) has achieved significant progress in the open domain in recent years, mainly due to large-scale pre-trained language models. However, it performs much worse in specific domains such as the medical field, owing to the lack of extensive training data and the neglect of professional structural knowledge. As a first effort, we collect a large-scale medical multi-choice question dataset (more than 21k instances) for the National Licensed Pharmacist Examination in China, a challenging medical examination with a passing rate of less than 14.2% in 2018. We then propose a novel reading comprehension model, KMQA, which fully exploits structural medical knowledge (i.e., a medical knowledge graph) and reference medical plain text (i.e., text snippets retrieved from reference books). Experimental results indicate that KMQA outperforms existing competitive models by a large margin and passes the exam with an accuracy of 61.8% on the test set.
Anthology ID:
2020.emnlp-main.111
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1427–1438
URL:
https://aclanthology.org/2020.emnlp-main.111
DOI:
10.18653/v1/2020.emnlp-main.111
Cite (ACL):
Dongfang Li, Baotian Hu, Qingcai Chen, Weihua Peng, and Anqi Wang. 2020. Towards Medical Machine Reading Comprehension with Structural Knowledge and Plain Text. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 1427–1438, Online. Association for Computational Linguistics.
Cite (Informal):
Towards Medical Machine Reading Comprehension with Structural Knowledge and Plain Text (Li et al., EMNLP 2020)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2020.emnlp-main.111.pdf
Optional supplementary material:
2020.emnlp-main.111.OptionalSupplementaryMaterial.zip