Fancy Man Launches Zippo at WNUT 2020 Shared Task-1: A Bert Case Model for Wet Lab Entity Extraction

Qingcheng Zeng, Xiaoyang Fang, Zhexin Liang, Haoding Meng


Abstract
Automatic or semi-automatic conversion of protocols that specify the steps of a lab procedure into a machine-readable format greatly benefits biological research. Processing these noisy, dense, and domain-specific lab protocols has drawn increasing interest with the development of deep learning. This paper presents our team's work on WNUT 2020 shared task-1, wet lab entity extraction, in which we studied several models, including a BiLSTM-CRF model and a BERT cased model, for extracting entities from wet lab protocols. We mainly discuss performance differences of the BERT cased model under conditions that may not have received enough attention before, such as the transformers library version and case sensitivity.
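As a point of reference for the BERT cased setup described in the abstract, the following is a minimal sketch (not the authors' code) of token classification with a cased BERT checkpoint via the Hugging Face transformers library; the checkpoint name and the entity labels shown are assumptions for illustration only.

```python
# Minimal sketch: cased BERT token classification, as one might set it up for
# wet lab entity extraction. The checkpoint and label set are placeholders,
# not the configuration reported in the paper.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "bert-base-cased"  # assumed cased checkpoint
labels = ["O", "B-Action", "I-Action", "B-Reagent", "I-Reagent"]  # illustrative subset

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=len(labels))

sentence = "Add 500 uL of lysis buffer to the sample ."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits              # shape: (1, seq_len, num_labels)
pred_ids = logits.argmax(dim=-1)[0].tolist()     # per-token label indices (untrained here)
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print(list(zip(tokens, [labels[i] for i in pred_ids])))
```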
Anthology ID:
2020.wnut-1.39
Volume:
Proceedings of the Sixth Workshop on Noisy User-generated Text (W-NUT 2020)
Month:
November
Year:
2020
Address:
Online
Editors:
Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
Venue:
WNUT
Publisher:
Association for Computational Linguistics
Pages:
299–304
URL:
https://aclanthology.org/2020.wnut-1.39
DOI:
10.18653/v1/2020.wnut-1.39
Cite (ACL):
Qingcheng Zeng, Xiaoyang Fang, Zhexin Liang, and Haoding Meng. 2020. Fancy Man Launches Zippo at WNUT 2020 Shared Task-1: A Bert Case Model for Wet Lab Entity Extraction. In Proceedings of the Sixth Workshop on Noisy User-generated Text (W-NUT 2020), pages 299–304, Online. Association for Computational Linguistics.
Cite (Informal):
Fancy Man Launches Zippo at WNUT 2020 Shared Task-1: A Bert Case Model for Wet Lab Entity Extraction (Zeng et al., WNUT 2020)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2020.wnut-1.39.pdf