Detecting Sarcasm in Conversation Context Using Transformer-Based Models

Adithya Avvaru, Sanath Vobilisetty, Radhika Mamidi


Abstract
Sarcasm detection, regarded as one of the sub-problems of sentiment analysis, is a tricky task because the introduction of sarcastic words can flip the sentiment of the sentence itself. To date, most research has revolved around detecting sarcasm in a single sentence, and there is very limited work on detecting sarcasm that arises from multiple sentences. Existing models use Long Short-Term Memory (LSTM) variants, with or without attention, to detect sarcasm in conversations. We show that models using state-of-the-art Bidirectional Encoder Representations from Transformers (BERT), which capture syntactic and semantic information across conversation sentences, perform better than these models. Based on data analysis, we estimate the number of conversation sentences that can contribute to the sarcasm, and the results agree with this estimate. We also perform a comparative study of different versions of our BERT-based model against variants of the LSTM model and XLNet (both using the estimated number of conversation sentences) and find that the BERT-based models outperform them.
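
The abstract describes pairing a response with its preceding conversation context and classifying the pair with BERT. The following is a minimal sketch of that idea, not the authors' implementation: the checkpoint name, the context window size K, and the toy example are illustrative assumptions, and a real model would first be fine-tuned on labelled conversations.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

MODEL_NAME = "bert-base-uncased"   # assumed checkpoint, not specified in the abstract
K = 3                              # assumed number of preceding context turns

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def predict_sarcasm(context_turns, response):
    """Pair the last K context turns (segment A) with the response (segment B)."""
    context = " ".join(context_turns[-K:])
    inputs = tokenizer(context, response, truncation=True,
                       max_length=256, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Index 1 is treated here as the "sarcastic" label; the head is untrained
    # until the model is fine-tuned on a sarcasm-in-conversation dataset.
    return torch.softmax(logits, dim=-1)[0, 1].item()

print(predict_sarcasm(
    ["I missed the bus again.", "And then it started raining."],
    "What a fantastic start to the day."))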
Anthology ID:
2020.figlang-1.15
Volume:
Proceedings of the Second Workshop on Figurative Language Processing
Month:
July
Year:
2020
Address:
Online
Editors:
Beata Beigman Klebanov, Ekaterina Shutova, Patricia Lichtenstein, Smaranda Muresan, Chee Wee (Ben) Leong, Anna Feldman, Debanjan Ghosh
Venue:
Fig-Lang
Publisher:
Association for Computational Linguistics
Pages:
98–103
URL:
https://aclanthology.org/2020.figlang-1.15
DOI:
10.18653/v1/2020.figlang-1.15
Cite (ACL):
Adithya Avvaru, Sanath Vobilisetty, and Radhika Mamidi. 2020. Detecting Sarcasm in Conversation Context Using Transformer-Based Models. In Proceedings of the Second Workshop on Figurative Language Processing, pages 98–103, Online. Association for Computational Linguistics.
Cite (Informal):
Detecting Sarcasm in Conversation Context Using Transformer-Based Models (Avvaru et al., Fig-Lang 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2020.figlang-1.15.pdf