Sarcasm Detection using Context Separators in Online Discourse

Tanvi Dadu, Kartikey Pant


Abstract
Sarcasm is an intricate form of speech in which meaning is conveyed implicitly. Being a convoluted form of expression, sarcasm is challenging to detect automatically. The difficulty of recognizing sarcasm causes many pitfalls, including misunderstandings in everyday communication, which motivates an increasing focus on automated sarcasm detection. In the second edition of the Figurative Language Processing (FigLang 2020) workshop, the shared task on sarcasm detection released two datasets containing responses along with their context, sampled from Twitter and Reddit. In this work, we use RoBERTa-large to detect sarcasm in both datasets. We further assert the importance of context in improving the performance of contextual word embedding based models by using three different types of inputs: Response-only, Context-Response, and Context-Response (Separated). We show that our proposed architecture performs competitively on both datasets, and that adding a separation token between context and target response yields an improvement of 5.13% in F1-score on the Reddit dataset.
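The three input formats named in the abstract can be sketched as follows. This is a minimal illustration, not the authors' released code: it assumes the context turns and response are plain strings, and uses RoBERTa's `</s>` separator token to mark the context/response boundary in the Separated variant.

```python
# Sketch of the three input variants described in the abstract.
# Assumption: context is a list of turn strings and response is a string;
# the authors' actual preprocessing may differ.

SEP = "</s>"  # RoBERTa's separator token


def build_inputs(context, response):
    """Return the three input variants for a context and target response."""
    joined_context = " ".join(context)
    return {
        # Response-only: the target response alone
        "response_only": response,
        # Context-Response: context and response concatenated directly
        "context_response": f"{joined_context} {response}",
        # Context-Response (Separated): a separator token marks the
        # boundary between context and response
        "context_response_separated": f"{joined_context} {SEP} {response}",
    }


inputs = build_inputs(
    context=["Great weather we're having."],
    response="Oh yeah, I just love getting soaked on my way to work.",
)
for name, text in inputs.items():
    print(f"{name}: {text}")
```

In practice each variant would then be tokenized and fed to the RoBERTa-large classifier; the Separated variant differs from Context-Response only in the explicit boundary token.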
Anthology ID:
2020.figlang-1.6
Volume:
Proceedings of the Second Workshop on Figurative Language Processing
Month:
July
Year:
2020
Address:
Online
Venue:
Fig-Lang
Publisher:
Association for Computational Linguistics
Pages:
51–55
URL:
https://aclanthology.org/2020.figlang-1.6
DOI:
10.18653/v1/2020.figlang-1.6
Cite (ACL):
Tanvi Dadu and Kartikey Pant. 2020. Sarcasm Detection using Context Separators in Online Discourse. In Proceedings of the Second Workshop on Figurative Language Processing, pages 51–55, Online. Association for Computational Linguistics.
Cite (Informal):
Sarcasm Detection using Context Separators in Online Discourse (Dadu & Pant, Fig-Lang 2020)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2020.figlang-1.6.pdf
Video:
http://slideslive.com/38929695