Simple Hierarchical Multi-Task Neural End-To-End Entity Linking for Biomedical Text

Maciej Wiatrak, Juha Iso-Sipila


Abstract
Recognising and linking entities is a crucial first step in many biomedical text analysis tasks, such as relation extraction and target identification. Traditionally, biomedical entity linking methods rely heavily on heuristic rules, predefined and often domain-specific features that try to capture the properties of entities, and complex multi-step architectures that detect and subsequently link entity mentions. We propose a significant simplification of the biomedical entity linking setup that does not rely on any heuristic methods. The system performs all the steps of the entity linking task jointly, in either one or two stages. We explore the use of hierarchical multi-task learning, with mention recognition and entity typing as auxiliary tasks. We show that hierarchical multi-task models consistently outperform single-task models when the trained tasks are homogeneous. We evaluate the performance of our models on biomedical entity linking benchmarks using the MedMentions and BC5CDR datasets. We achieve state-of-the-art results on the challenging MedMentions dataset, and comparable results on BC5CDR.
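For concreteness, below is a minimal PyTorch sketch of the hierarchical multi-task idea described in the abstract: auxiliary heads for mention recognition and entity typing are supervised at lower encoder layers, while the entity linking head sits on the top layer. All module names, layer sizes, and label counts are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class HierarchicalMultiTaskLinker(nn.Module):
    """Hypothetical sketch: lower BiLSTM layers feed auxiliary heads
    (mention recognition, entity typing); the top layer feeds the
    entity-linking head. Dimensions and label counts are illustrative."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256,
                 num_mention_tags=3, num_types=21, num_entities=50000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One BiLSTM layer per task level (hierarchical supervision).
        self.lstm_mention = nn.LSTM(embed_dim, hidden_dim // 2,
                                    batch_first=True, bidirectional=True)
        self.lstm_type = nn.LSTM(hidden_dim, hidden_dim // 2,
                                 batch_first=True, bidirectional=True)
        self.lstm_link = nn.LSTM(hidden_dim, hidden_dim // 2,
                                 batch_first=True, bidirectional=True)
        # Per-token classification heads.
        self.mention_head = nn.Linear(hidden_dim, num_mention_tags)  # e.g. BIO tags
        self.type_head = nn.Linear(hidden_dim, num_types)            # semantic types
        self.link_head = nn.Linear(hidden_dim, num_entities)         # KB concept IDs

    def forward(self, token_ids):
        x = self.embed(token_ids)
        h1, _ = self.lstm_mention(x)   # supervised with mention-recognition labels
        h2, _ = self.lstm_type(h1)     # supervised with entity-type labels
        h3, _ = self.lstm_link(h2)     # supervised with entity-linking labels
        return {
            "mention_logits": self.mention_head(h1),
            "type_logits": self.type_head(h2),
            "link_logits": self.link_head(h3),
        }

Joint training in such a setup would simply sum the per-token cross-entropy losses of the three heads, which is the standard way auxiliary supervision is injected into a hierarchical multi-task model.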
Anthology ID:
2020.louhi-1.2
Volume:
Proceedings of the 11th International Workshop on Health Text Mining and Information Analysis
Month:
November
Year:
2020
Address:
Online
Venue:
Louhi
Publisher:
Association for Computational Linguistics
Pages:
12–17
URL:
https://aclanthology.org/2020.louhi-1.2
DOI:
10.18653/v1/2020.louhi-1.2
Cite (ACL):
Maciej Wiatrak and Juha Iso-Sipila. 2020. Simple Hierarchical Multi-Task Neural End-To-End Entity Linking for Biomedical Text. In Proceedings of the 11th International Workshop on Health Text Mining and Information Analysis, pages 12–17, Online. Association for Computational Linguistics.
Cite (Informal):
Simple Hierarchical Multi-Task Neural End-To-End Entity Linking for Biomedical Text (Wiatrak & Iso-Sipila, Louhi 2020)
PDF:
https://aclanthology.org/2020.louhi-1.2.pdf
Video:
https://slideslive.com/38940048