Knowledge-Enabled Diagnosis Assistant Based on Obstetric EMRs and Knowledge Graph

Kunli Zhang, Xu Zhao, Lei Zhuang, Qi Xie, Hongying Zan


Abstract
The obstetric Electronic Medical Record (EMR) contains a large amount of medical data and health information and plays a vital role in improving the quality of diagnosis assistant services. In this paper, we treat the diagnosis assistant as a multi-label classification task and propose a Knowledge-Enabled Diagnosis Assistant (KEDA) model for obstetric diagnosis assistance. We utilize the numerical information in EMRs and external knowledge from the Chinese Obstetric Knowledge Graph (COKG) to enhance the text representation of EMRs. Specifically, the bidirectional maximum matching method and a similarity-based approach are used to obtain the set of entities contained in EMRs and link them to the COKG. The final knowledge representation is obtained by a weight-based disease prediction algorithm and is fused with the text representation through a linear weighting method. Experimental results show that our approach brings a +3.53 F1 score improvement over a strong BERT baseline on the diagnosis assistant task.
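The entity-extraction step mentioned in the abstract (bidirectional maximum matching against COKG entity mentions) can be illustrated with a short sketch. The dictionary contents, the maximum match length, and the tie-breaking rule (prefer the direction that yields fewer segments, then fewer single characters) are illustrative assumptions and not the authors' exact configuration; the paper's similarity-based linking and linear-weighted fusion steps are not shown here.

```python
# Hypothetical sketch of bidirectional maximum matching (BMM) over EMR text
# using a dictionary of COKG entity mentions. Assumptions: vocabulary,
# max_len, and tie-breaking rule are illustrative, not from the paper.

def forward_max_match(text, vocab, max_len=8):
    """Greedy longest-match scan from left to right."""
    i, segments = 0, []
    while i < len(text):
        matched = text[i]  # fall back to a single character
        for j in range(min(len(text), i + max_len), i, -1):
            if text[i:j] in vocab:
                matched = text[i:j]
                break
        segments.append(matched)
        i += len(matched)
    return segments

def backward_max_match(text, vocab, max_len=8):
    """Greedy longest-match scan from right to left."""
    i, segments = len(text), []
    while i > 0:
        matched = text[i - 1]  # fall back to a single character
        for j in range(max(0, i - max_len), i):
            if text[j:i] in vocab:
                matched = text[j:i]
                break
        segments.insert(0, matched)
        i -= len(matched)
    return segments

def bidirectional_max_match(text, vocab, max_len=8):
    """Keep the direction with fewer segments, then fewer single characters."""
    fwd = forward_max_match(text, vocab, max_len)
    bwd = backward_max_match(text, vocab, max_len)
    if len(fwd) != len(bwd):
        return fwd if len(fwd) < len(bwd) else bwd
    singles = lambda seg: sum(len(s) == 1 for s in seg)
    return fwd if singles(fwd) <= singles(bwd) else bwd

if __name__ == "__main__":
    # Hypothetical COKG entity mentions (obstetric terms).
    cokg_vocab = {"妊娠期高血压", "高血压", "胎膜早破", "头痛"}
    emr_text = "患者妊娠期高血压伴头痛"
    segments = bidirectional_max_match(emr_text, cokg_vocab)
    entities = [s for s in segments if s in cokg_vocab]
    print(entities)  # expected: ['妊娠期高血压', '头痛']
```

Segments that match dictionary entries would then be linked to COKG nodes by the similarity-based approach described in the paper before the weight-based disease prediction and fusion steps.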
Anthology ID:
2020.ccl-1.107
Volume:
Proceedings of the 19th Chinese National Conference on Computational Linguistics
Month:
October
Year:
2020
Address:
Haikou, China
Editors:
Maosong Sun (孙茂松), Sujian Li (李素建), Yue Zhang (张岳), Yang Liu (刘洋)
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Pages:
1155–1165
Language:
English
URL:
https://aclanthology.org/2020.ccl-1.107
Cite (ACL):
Kunli Zhang, Xu Zhao, Lei Zhuang, Qi Xie, and Hongying Zan. 2020. Knowledge-Enabled Diagnosis Assistant Based on Obstetric EMRs and Knowledge Graph. In Proceedings of the 19th Chinese National Conference on Computational Linguistics, pages 1155–1165, Haikou, China. Chinese Information Processing Society of China.
Cite (Informal):
Knowledge-Enabled Diagnosis Assistant Based on Obstetric EMRs and Knowledge Graph (Zhang et al., CCL 2020)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2020.ccl-1.107.pdf