Abstract
Transformer models, pretrained and publicly released in recent years, have proved effective on many NLP tasks. We tested their usefulness on the stance detection task in particular, performing experiments on the data from the Fake News Challenge Stage 1 (FNC-1). We were indeed able to improve on the reported state of the art (SotA) for the challenge by exploiting the generalization power of large language models based on the Transformer architecture. Specifically, (1) we improved the best-performing FNC-1 model by adding BERT sentence embeddings of the input sequences as a model feature, and (2) we fine-tuned the BERT, XLNet, and RoBERTa transformers on an extended FNC-1 dataset and obtained state-of-the-art results on the FNC-1 task.
- Anthology ID:
- 2020.lrec-1.152
- Volume:
- Proceedings of the Twelfth Language Resources and Evaluation Conference
- Month:
- May
- Year:
- 2020
- Address:
- Marseille, France
- Editors:
- Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Asuncion Moreno, Jan Odijk, Stelios Piperidis
- Venue:
- LREC
- Publisher:
- European Language Resources Association
- Pages:
- 1211–1218
- Language:
- English
- URL:
- https://aclanthology.org/2020.lrec-1.152
- Cite (ACL):
- Valeriya Slovikovskaya and Giuseppe Attardi. 2020. Transfer Learning from Transformers to Fake News Challenge Stance Detection (FNC-1) Task. In Proceedings of the Twelfth Language Resources and Evaluation Conference, pages 1211–1218, Marseille, France. European Language Resources Association.
- Cite (Informal):
- Transfer Learning from Transformers to Fake News Challenge Stance Detection (FNC-1) Task (Slovikovskaya & Attardi, LREC 2020)
- PDF:
- https://aclanthology.org/2020.lrec-1.152.pdf
- Data:
- GLUE, RACE, WebText
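
As an illustration of the fine-tuning approach described in the abstract, here is a minimal sketch using the Hugging Face `transformers` library. This is not the authors' code: the checkpoint (`roberta-base`), sequence length, and toy headline/body pair are assumptions made for the example.

```python
# Minimal sketch (not the paper's code): fine-tuning a Transformer
# for FNC-1 stance detection with Hugging Face `transformers`.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["agree", "disagree", "discuss", "unrelated"]  # FNC-1 stance classes

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=len(LABELS)
)

# FNC-1 pairs a headline with an article body; encode them as one
# sequence pair, truncating the (often long) body to the model limit.
headline = "Example headline"          # assumed toy input
body = "Example article body text."   # assumed toy input
inputs = tokenizer(
    headline, body, truncation=True, max_length=512, return_tensors="pt"
)

# One training step; `label` stands in for the gold stance index.
label = torch.tensor([LABELS.index("discuss")])
outputs = model(**inputs, labels=label)
outputs.loss.backward()  # plug into an optimizer loop, e.g. AdamW
```

In practice this step would be wrapped in a full training loop over the FNC-1 headline/body pairs, with evaluation under the official FNC-1 weighted scoring scheme.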