Abstract
The explosive growth and popularity of social media have revolutionised the way we communicate and collaborate. Unfortunately, the same ease of accessing and sharing information has led to an explosion of misinformation and propaganda. Given that stance detection can significantly aid in veracity prediction, this work focuses on boosting automated stance detection, a task on which, as on several other tasks, pre-trained models have been extremely successful. We show that stance detection can benefit from feature-based information, especially on certain underperforming classes; however, integrating such features into pre-trained models using ensembling is challenging. We propose a novel architecture for integrating features with pre-trained models that addresses these challenges and test our method on the RumourEval 2019 dataset. This method achieves state-of-the-art results, with an F1-score of 63.94 on the test set.
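The abstract describes integrating count-based features into a pre-trained model through a dedicated architecture rather than ensembling, but does not spell out that architecture here. The sketch below is only a generic illustration of one common fusion approach, concatenating a count-based feature vector with a BERT [CLS] representation before a classification head; the model name, feature dimension, hidden sizes, and fusion-by-concatenation strategy are all placeholder assumptions, not the paper's method.

```python
# Minimal sketch of combining count-based features with a pre-trained
# transformer for stance classification (support/deny/query/comment).
# NOTE: illustrative assumption only, not the architecture proposed in the
# paper; names and dimensions are placeholders.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class FeatureAugmentedStanceClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", n_features=10, n_classes=4):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Classifier head over the [CLS] representation concatenated with
        # the count-based feature vector (hypothetical fusion strategy).
        self.classifier = nn.Sequential(
            nn.Linear(hidden + n_features, 256),
            nn.ReLU(),
            nn.Dropout(0.1),
            nn.Linear(256, n_classes),
        )

    def forward(self, input_ids, attention_mask, count_features):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] token representation
        fused = torch.cat([cls, count_features], dim=-1)
        return self.classifier(fused)


# Toy usage with stand-in count-based features (e.g. punctuation or
# word-frequency counts per post).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["That claim is completely false!"], return_tensors="pt",
                  padding=True, truncation=True)
features = torch.rand(1, 10)  # placeholder for real count-based features
model = FeatureAugmentedStanceClassifier()
logits = model(batch["input_ids"], batch["attention_mask"], features)
```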
- Anthology ID:
- 2020.nlp4if-1.3
- Volume:
- Proceedings of the 3rd NLP4IF Workshop on NLP for Internet Freedom: Censorship, Disinformation, and Propaganda
- Month:
- December
- Year:
- 2020
- Address:
- Barcelona, Spain (Online)
- Venue:
- NLP4IF
- Publisher:
- International Committee on Computational Linguistics (ICCL)
- Pages:
- 22–32
- URL:
- https://aclanthology.org/2020.nlp4if-1.3
- Cite (ACL):
- Anushka Prakash and Harish Tayyar Madabushi. 2020. Incorporating Count-Based Features into Pre-Trained Models for Improved Stance Detection. In Proceedings of the 3rd NLP4IF Workshop on NLP for Internet Freedom: Censorship, Disinformation, and Propaganda, pages 22–32, Barcelona, Spain (Online). International Committee on Computational Linguistics (ICCL).
- Cite (Informal):
- Incorporating Count-Based Features into Pre-Trained Models for Improved Stance Detection (Prakash & Tayyar Madabushi, NLP4IF 2020)
- PDF:
- https://preview.aclanthology.org/ingestion-script-update/2020.nlp4if-1.3.pdf
- Code:
- Anushka-Prakash/RumourEval-2019-Stance-Detection