View Distillation with Unlabeled Data for Extracting Adverse Drug Effects from User-Generated Data

Payam Karisani, Jinho D. Choi, Li Xiong


Abstract
We present an algorithm based on multi-layer transformers for identifying Adverse Drug Reactions (ADR) in social media data. Our model exploits the properties of the task and the characteristics of contextual word embeddings to extract two views from each document. A classifier is then trained on each view and used to label a set of unlabeled documents, which in turn serve to initialize a new classifier in the other view. Finally, the initialized classifier in each view is further trained on the original labeled examples. We evaluated our model on the largest publicly available ADR dataset. The experiments show that our model significantly outperforms transformer-based models pretrained on domain-specific data.
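The cross-view training loop described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: it substitutes a toy nearest-centroid classifier for the transformer-based classifiers, and all function names, data, and the interpolation-based "further training" step are hypothetical assumptions for the sake of a runnable example.

```python
# Hypothetical sketch of the view-distillation loop from the abstract.
# A toy nearest-centroid classifier stands in for the paper's
# transformer-based classifiers; names and logic are illustrative.

def train_centroids(X, y):
    """'Train' a classifier by averaging feature vectors per class."""
    sums, counts = {}, {}
    for x, label in zip(X, y):
        s = sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {c: [v / counts[c] for v in s] for c, s in sums.items()}

def predict(centroids, x):
    """Assign the class whose centroid is nearest (squared distance)."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda c: dist(centroids[c], x))

def refine(init, X, y, alpha=0.5):
    """Further train an initialized classifier on the labeled data,
    here modeled as interpolating its centroids with fresh ones."""
    fresh = train_centroids(X, y)
    out = {}
    for c in set(init) | set(fresh):
        a, b = init.get(c), fresh.get(c)
        if a is None:
            out[c] = b
        elif b is None:
            out[c] = a
        else:
            out[c] = [(1 - alpha) * ai + alpha * bi for ai, bi in zip(a, b)]
    return out

def view_distillation(labeled_a, labeled_b, y, unlabeled_a, unlabeled_b):
    # Step 1: train one classifier per view on the labeled examples.
    clf_a = train_centroids(labeled_a, y)
    clf_b = train_centroids(labeled_b, y)
    # Step 2: each classifier pseudo-labels the unlabeled pool.
    pseudo_from_a = [predict(clf_a, x) for x in unlabeled_a]
    pseudo_from_b = [predict(clf_b, x) for x in unlabeled_b]
    # Step 3: pseudo-labels from one view initialize a new
    # classifier in the *other* view.
    init_b = train_centroids(unlabeled_b, pseudo_from_a)
    init_a = train_centroids(unlabeled_a, pseudo_from_b)
    # Step 4: further train each initialized classifier on the
    # original labeled examples.
    return refine(init_a, labeled_a, y), refine(init_b, labeled_b, y)
```

The key point the sketch tries to capture is the asymmetry of step 3: knowledge distilled from one view is transferred to the other via the unlabeled pool, before each view returns to the original supervision in step 4.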
Anthology ID:
2021.smm4h-1.2
Volume:
Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task
Month:
June
Year:
2021
Address:
Mexico City, Mexico
Editors:
Arjun Magge, Ari Klein, Antonio Miranda-Escalada, Mohammed Ali Al-garadi, Ilseyar Alimova, Zulfat Miftahutdinov, Eulalia Farre-Maduell, Salvador Lima Lopez, Ivan Flores, Karen O'Connor, Davy Weissenbacher, Elena Tutubalina, Abeed Sarker, Juan M Banda, Martin Krallinger, Graciela Gonzalez-Hernandez
Venue:
SMM4H
Publisher:
Association for Computational Linguistics
Pages:
7–12
URL:
https://aclanthology.org/2021.smm4h-1.2
DOI:
10.18653/v1/2021.smm4h-1.2
Cite (ACL):
Payam Karisani, Jinho D. Choi, and Li Xiong. 2021. View Distillation with Unlabeled Data for Extracting Adverse Drug Effects from User-Generated Data. In Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task, pages 7–12, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
View Distillation with Unlabeled Data for Extracting Adverse Drug Effects from User-Generated Data (Karisani et al., SMM4H 2021)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2021.smm4h-1.2.pdf