Automatic Detecting for Health-related Twitter Data with BioBERT

Yang Bai, Xiaobing Zhou


Abstract
Social media used for health applications typically contains large amounts of user-posted data, which poses various challenges for NLP, such as colloquial language, spelling errors, and novel or creative phrases. In this paper, we describe our system submitted to SMM4H 2020: the Social Media Mining for Health Applications shared task, which consists of five subtasks. We participate in subtask 1, subtask 2-English, and subtask 5. Our final submission is an ensemble of fine-tuned transformer-based models. We show that these approaches perform well on imbalanced datasets (for example, the class ratio in subtask 2 is 1:10) but poorly on extremely imbalanced datasets (for example, the class ratio in subtask 1 is 1:400). Accordingly, our result in subtask 1 is below the average score, our result in subtask 2-English is above the average score, and our result in subtask 5 achieves the highest score. The code is available online.
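The abstract describes fine-tuning transformer models such as BioBERT for tweet classification and ensembling them under class imbalance. The sketch below illustrates that general recipe with HuggingFace Transformers; the checkpoint name, max length, class weights, and logit-averaging ensemble are illustrative assumptions, not the authors' exact configuration.

# A minimal sketch of the kind of pipeline the abstract describes: fine-tune a
# BioBERT checkpoint on binary tweet classification and average the logits of
# several fine-tuned models. All hyperparameters here are assumptions.
import torch
from torch.nn import CrossEntropyLoss
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "dmis-lab/biobert-base-cased-v1.1"  # a public BioBERT checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Class weighting is one common response to the 1:10 imbalance mentioned in
# the abstract (hypothetical here; the paper may handle imbalance differently).
loss_fn = CrossEntropyLoss(weight=torch.tensor([1.0, 10.0]))

def train_step(batch_texts, batch_labels, optimizer):
    """One gradient step on a batch of raw tweet strings and integer labels."""
    enc = tokenizer(batch_texts, padding=True, truncation=True,
                    max_length=128, return_tensors="pt")
    logits = model(**enc).logits
    loss = loss_fn(logits, batch_labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

@torch.no_grad()
def ensemble_predict(models, texts):
    """Average logits across independently fine-tuned models, then argmax."""
    enc = tokenizer(texts, padding=True, truncation=True,
                    max_length=128, return_tensors="pt")
    logits = torch.stack([m(**enc).logits for m in models]).mean(dim=0)
    return logits.argmax(dim=-1)

# Example usage (hypothetical data):
# optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
# train_step(["aspirin gave me a terrible headache"], torch.tensor([1]), optimizer)

Averaging logits across independently fine-tuned runs is one standard ensembling choice; the paper may combine its models differently (e.g., majority voting).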
Anthology ID:
2020.smm4h-1.10
Volume:
Proceedings of the Fifth Social Media Mining for Health Applications Workshop & Shared Task
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Graciela Gonzalez-Hernandez, Ari Z. Klein, Ivan Flores, Davy Weissenbacher, Arjun Magge, Karen O'Connor, Abeed Sarker, Anne-Lyse Minard, Elena Tutubalina, Zulfat Miftahutdinov, Ilseyar Alimova
Venue:
SMM4H
Publisher:
Association for Computational Linguistics
Pages:
63–69
URL:
https://aclanthology.org/2020.smm4h-1.10
Cite (ACL):
Yang Bai and Xiaobing Zhou. 2020. Automatic Detecting for Health-related Twitter Data with BioBERT. In Proceedings of the Fifth Social Media Mining for Health Applications Workshop & Shared Task, pages 63–69, Barcelona, Spain (Online). Association for Computational Linguistics.
Cite (Informal):
Automatic Detecting for Health-related Twitter Data with BioBERT (Bai & Zhou, SMM4H 2020)
PDF:
https://aclanthology.org/2020.smm4h-1.10.pdf
Data
SMM4H