Global Locality in Biomedical Relation and Event Extraction

Elaheh ShafieiBavani, Antonio Jimeno Yepes, Xu Zhong, David Martinez Iraola


Abstract
Due to the exponential growth of biomedical literature, event and relation extraction are important tasks in biomedical text mining. Most existing work focuses only on relation extraction and detects a single entity-pair mention within a short span of text, which is not ideal given the long sentences that appear in biomedical contexts. We propose an approach to both relation and event extraction that simultaneously predicts relationships between all mention pairs in a text. We also perform an empirical study of different network setups for this purpose. The best-performing model combines a set of multi-head attentions and convolutions, an adaptation of the transformer architecture, which gives self-attention the ability to strengthen dependencies among related elements and models the interaction between features extracted by multiple attention heads. Experimental results demonstrate that our approach outperforms the state of the art on a set of benchmark biomedical corpora, including the BioNLP 2009, 2011, and 2013 and BioCreative 2017 shared tasks.
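To make the architecture described above concrete, here is a minimal pure-Python sketch of the general idea: multi-head self-attention over the whole sequence (so every token can attend to every other token), whose concatenated head outputs are then mixed by a 1-D convolution over the token axis. All dimensions, weights, and function names are illustrative assumptions for exposition, not the paper's actual model.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_head(seq, w):
    # seq: list of per-token feature vectors; w: an illustrative per-head
    # scaling weight. Scaled dot-product self-attention over the full
    # sequence, so dependencies are not limited to a short span.
    d = len(seq[0])
    out = []
    for q in seq:
        scores = [w * sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in seq]
        probs = softmax(scores)
        out.append([sum(p * v[i] for p, v in zip(probs, seq))
                    for i in range(d)])
    return out

def multi_head(seq, head_weights):
    # Run one head per weight and concatenate outputs along the feature axis.
    heads = [attention_head(seq, w) for w in head_weights]
    return [sum((h[t] for h in heads), []) for t in range(len(seq))]

def conv1d(seq, kernel):
    # 1-D convolution over the token axis (zero padding, applied per
    # feature), modelling interaction between features from the heads.
    k, pad, d = len(kernel), len(kernel) // 2, len(seq[0])
    padded = [[0.0] * d] * pad + seq + [[0.0] * d] * pad
    return [[sum(kernel[j] * padded[t + j][i] for j in range(k))
             for i in range(d)]
            for t in range(len(seq))]
```

For example, `conv1d(multi_head(tokens, [1.0, 0.5]), [0.25, 0.5, 0.25])` yields one contextualized vector per token, over which relation labels for all mention pairs could then be scored.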
Anthology ID:
2020.bionlp-1.21
Volume:
Proceedings of the 19th SIGBioMed Workshop on Biomedical Language Processing
Month:
July
Year:
2020
Address:
Online
Editors:
Dina Demner-Fushman, Kevin Bretonnel Cohen, Sophia Ananiadou, Junichi Tsujii
Venue:
BioNLP
SIG:
SIGBIOMED
Publisher:
Association for Computational Linguistics
Pages:
195–204
URL:
https://aclanthology.org/2020.bionlp-1.21
DOI:
10.18653/v1/2020.bionlp-1.21
Cite (ACL):
Elaheh ShafieiBavani, Antonio Jimeno Yepes, Xu Zhong, and David Martinez Iraola. 2020. Global Locality in Biomedical Relation and Event Extraction. In Proceedings of the 19th SIGBioMed Workshop on Biomedical Language Processing, pages 195–204, Online. Association for Computational Linguistics.
Cite (Informal):
Global Locality in Biomedical Relation and Event Extraction (ShafieiBavani et al., BioNLP 2020)
PDF:
https://preview.aclanthology.org/naacl24-info/2020.bionlp-1.21.pdf