HLTRI at W-NUT 2020 Shared Task-3: COVID-19 Event Extraction from Twitter Using Multi-Task Hopfield Pooling

Maxwell Weinzierl, Sanda Harabagiu


Abstract
Extracting structured knowledge involving self-reported events related to the COVID-19 pandemic from Twitter has the potential to inform surveillance systems that play a critical role in public health. The event extraction challenge presented by the W-NUT 2020 Shared Task 3 focused on the identification of five types of events relevant to the COVID-19 pandemic and their respective sets of pre-defined slots encoding demographic, epidemiological, and clinical knowledge, as well as spatial, temporal, or subjective information. Our participation in the challenge led to the design of a neural architecture for jointly identifying all Event Slots expressed in a tweet relevant to an event of interest. This architecture uses COVID-Twitter-BERT as the pre-trained language model. In addition, to learn text span embeddings for each Event Slot, we relied on a special case of Hopfield Networks, namely Hopfield pooling. The results of the shared task evaluation indicate that our system performs best when trained on a larger dataset, while remaining competitive when trained on smaller datasets.
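The abstract does not spell out the Hopfield pooling operation; as a rough illustration, a single update step of a modern Hopfield network used for pooling amounts to attention over the token embeddings with a learned query per Event Slot. The sketch below assumes that formulation; all names, shapes, and the inverse-temperature parameter `beta` are illustrative, not taken from the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hopfield_pooling(token_embeddings, state_pattern, beta=1.0):
    """One retrieval step of a modern Hopfield network used as pooling:
    a learned state pattern (query) attends over the stored patterns
    (token embeddings) and returns their attention-weighted average.
    In a multi-task setup, each Event Slot would have its own query."""
    scores = beta * token_embeddings @ state_pattern   # (seq_len,)
    weights = softmax(scores)                          # attention over tokens
    return weights @ token_embeddings                  # (hidden,) pooled embedding

# Illustrative shapes: 12 contextual token embeddings of width 8.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(12, 8))
slot_query = rng.normal(size=8)      # one learned query per Event Slot
pooled = hopfield_pooling(tokens, slot_query, beta=2.0)
```

A larger `beta` sharpens the attention toward the single token best matching the slot query, while a small `beta` approaches mean pooling; in the trained model the queries would be learned jointly with the slot classifiers.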
Anthology ID:
2020.wnut-1.80
Volume:
Proceedings of the Sixth Workshop on Noisy User-generated Text (W-NUT 2020)
Month:
November
Year:
2020
Address:
Online
Venue:
WNUT
Publisher:
Association for Computational Linguistics
Pages:
530–538
URL:
https://aclanthology.org/2020.wnut-1.80
DOI:
10.18653/v1/2020.wnut-1.80
Cite (ACL):
Maxwell Weinzierl and Sanda Harabagiu. 2020. HLTRI at W-NUT 2020 Shared Task-3: COVID-19 Event Extraction from Twitter Using Multi-Task Hopfield Pooling. In Proceedings of the Sixth Workshop on Noisy User-generated Text (W-NUT 2020), pages 530–538, Online. Association for Computational Linguistics.
Cite (Informal):
HLTRI at W-NUT 2020 Shared Task-3: COVID-19 Event Extraction from Twitter Using Multi-Task Hopfield Pooling (Weinzierl & Harabagiu, WNUT 2020)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2020.wnut-1.80.pdf