Low Resource Causal Event Detection from Biomedical Literature

Zhengzhong Liang, Enrique Noriega-Atala, Clayton Morrison, Mihai Surdeanu


Abstract
Recognizing causal precedence relations among chemical interactions in the biomedical literature is crucial to understanding the underlying biological mechanisms. However, detecting such causal relations can be hard because: (1) these causal relations among events are often not expressed explicitly by specific phrases but are implied by very diverse expressions in the text, and (2) annotating datasets for causal relation detection requires considerable expert knowledge and effort. In this paper, we propose a strategy to address both challenges by training neural models with in-domain pre-training and knowledge distillation. We show that, using a very limited amount of labeled data and a sufficient amount of unlabeled data, the neural models outperform previous baselines on the causal precedence detection task and are ten times faster at inference than the BERT base model.
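The abstract names knowledge distillation but does not spell out its mechanics. As a generic illustration (not the paper's actual training setup), distillation trains a small student model to match a larger teacher's temperature-softened output distribution via a KL-divergence loss. A minimal sketch in pure Python; the function names and the temperature value are illustrative assumptions, not taken from the paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of raw logits.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the teacher's soft targets to the student's
    # distribution, scaled by T^2 (the standard Hinton-style form).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student whose logits match the teacher's incurs zero loss;
# any mismatch yields a positive penalty.
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # 0.0
print(distillation_loss([0.1, 0.5, -1.0], [2.0, 0.5, -1.0]) > 0)  # True
```

In practice this soft-target loss is combined with the ordinary cross-entropy on gold labels, which is what lets a compact student run much faster than a BERT-base teacher while retaining most of its accuracy.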
Anthology ID:
2022.bionlp-1.24
Volume:
Proceedings of the 21st Workshop on Biomedical Language Processing
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Dina Demner-Fushman, Kevin Bretonnel Cohen, Sophia Ananiadou, Junichi Tsujii
Venue:
BioNLP
Publisher:
Association for Computational Linguistics
Pages:
252–263
URL:
https://aclanthology.org/2022.bionlp-1.24
DOI:
10.18653/v1/2022.bionlp-1.24
Cite (ACL):
Zhengzhong Liang, Enrique Noriega-Atala, Clayton Morrison, and Mihai Surdeanu. 2022. Low Resource Causal Event Detection from Biomedical Literature. In Proceedings of the 21st Workshop on Biomedical Language Processing, pages 252–263, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Low Resource Causal Event Detection from Biomedical Literature (Liang et al., BioNLP 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.bionlp-1.24.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2022.bionlp-1.24.mp4