MedWriter: Knowledge-Aware Medical Text Generation

Youcheng Pan, Qingcai Chen, Weihua Peng, Xiaolong Wang, Baotian Hu, Xin Liu, Junying Chen, Wenxiu Zhou


Abstract
Exploiting domain knowledge to guarantee the correctness of generated text has been a hot topic in recent years, especially in highly specialized domains such as medicine. However, most recent works consider only unstructured text rather than the structured information of a knowledge graph. In this paper, we focus on the medical topic-to-text generation task and adapt a knowledge-aware text generation model, named MedWriter, to the medical domain; it not only introduces specific knowledge from an external medical knowledge graph (MKG) but is also capable of learning graph-level representations. We conduct experiments on a medical literature dataset collected from medical journals, in which each instance consists of a set of topic words, the abstract of a medical article, and a corresponding knowledge graph from CMeKG. Experimental results demonstrate that incorporating a knowledge graph into the generation model improves the quality of the generated text and yields robust improvements over competitor methods.
Anthology ID:
2020.coling-main.214
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2363–2368
URL:
https://aclanthology.org/2020.coling-main.214
DOI:
10.18653/v1/2020.coling-main.214
Cite (ACL):
Youcheng Pan, Qingcai Chen, Weihua Peng, Xiaolong Wang, Baotian Hu, Xin Liu, Junying Chen, and Wenxiu Zhou. 2020. MedWriter: Knowledge-Aware Medical Text Generation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2363–2368, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
MedWriter: Knowledge-Aware Medical Text Generation (Pan et al., COLING 2020)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2020.coling-main.214.pdf