Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture

Ashok Thillaisundaram, Theodosia Togia

Abstract
This paper presents our participation in the AGAC Track from the 2019 BioNLP Open Shared Tasks. We provide a solution for Task 3, which aims to extract “gene - function change - disease” triples, where “gene” and “disease” are mentions of particular genes and diseases respectively and “function change” is one of four pre-defined relationship types. Our system extends BERT (Devlin et al., 2018), a state-of-the-art language model, which learns contextual language representations from a large unlabelled corpus and whose parameters can be fine-tuned to solve specific tasks with minimal additional architecture. We encode the pair of mentions and their textual context as two consecutive sequences in BERT, separated by a special symbol. We then use a single linear layer to classify their relationship into five classes (four pre-defined, as well as ‘no relation’). Despite considerable class imbalance, our system significantly outperforms a random baseline while relying on an extremely simple setup with no specially engineered features.
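The setup described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' released code: it assumes the Hugging Face Transformers library and the 'bert-base-uncased' checkpoint, and the example mention pair and context string are invented. It shows the two-sequence BERT encoding (mentions and context separated by the special [SEP] symbol) with a single linear layer classifying into five relation classes.

    # Minimal sketch (not the authors' code), assuming Hugging Face
    # Transformers and the 'bert-base-uncased' checkpoint; the example
    # strings below are hypothetical.
    import torch
    from transformers import BertTokenizer, BertForSequenceClassification

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    # Four pre-defined "function change" classes plus 'no relation'.
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=5
    )

    # First sequence: the gene and disease mentions; second sequence:
    # their textual context. The tokenizer inserts [CLS] and [SEP].
    mentions = "BRCA1 ; breast cancer"
    context = ("Loss-of-function mutations in BRCA1 are associated "
               "with hereditary breast cancer.")
    inputs = tokenizer(mentions, context, return_tensors="pt",
                       truncation=True)

    # A single linear layer over the pooled [CLS] representation
    # produces logits for the five relation classes.
    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, 5)
    predicted_class = logits.argmax(dim=-1).item()

Fine-tuning such a model would minimise a cross-entropy loss over the five classes on the task's training data; the linear layer on top of the [CLS] representation is the only task-specific architecture, matching the "minimal task-specific architecture" of the title.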
Anthology ID:
D19-5713
Volume:
Proceedings of the 5th Workshop on BioNLP Open Shared Tasks
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Jin-Dong Kim, Claire Nédellec, Robert Bossy, Louise Deléger
Venue:
BioNLP
Publisher:
Association for Computational Linguistics
Pages:
84–89
URL:
https://aclanthology.org/D19-5713
DOI:
10.18653/v1/D19-5713
Cite (ACL):
Ashok Thillaisundaram and Theodosia Togia. 2019. Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture. In Proceedings of the 5th Workshop on BioNLP Open Shared Tasks, pages 84–89, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture (Thillaisundaram & Togia, BioNLP 2019)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/D19-5713.pdf