CLPLM: Character Level Pretrained Language Model for Extracting Support Phrases for Sentiment Labels

Raj Pranesh, Sumit Kumar, Ambesh Shekhar


Abstract
In this paper, we design a character-level pre-trained language model for extracting support phrases from tweets given a sentiment label. We also propose a character-level ensemble model that blends pre-trained contextual embedding (PCE) models (RoBERTa, BERT, and ALBERT) with neural network models (RNN, CNN, and WaveNet) at different stages of the architecture. For a given tweet and its associated sentiment label, our model predicts the span of the phrase in the tweet that prompts that sentiment. In our experiments, we explored various model architectures and configurations for both single and ensemble models, and performed a systematic comparative analysis of their performance based on the Jaccard score. The best-performing ensemble model obtained the highest Jaccard score of 73.5, a relative improvement of 2.4% over the best-performing single RoBERTa-based character-level model (71.5 Jaccard).
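The abstract describes two concrete pieces: a span-extraction model and evaluation by Jaccard score. The following is a minimal sketch of both, assuming a standard QA-style span head on a single RoBERTa encoder and the usual word-level Jaccard between predicted and reference phrases; the paper's actual model is character-level and ensembles several encoders with RNN, CNN, and WaveNet layers, which this sketch does not reproduce.

```python
import torch
import torch.nn as nn
from transformers import RobertaModel


class SpanExtractor(nn.Module):
    """Hypothetical QA-style span head (not the paper's exact model):
    for every token, predict start and end logits; the best-scoring
    (start, end) pair marks the support phrase for the given sentiment,
    which is typically prepended to the tweet text before tokenization."""

    def __init__(self, model_name: str = "roberta-base"):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained(model_name)
        self.head = nn.Linear(self.encoder.config.hidden_size, 2)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        start_logits, end_logits = self.head(hidden).split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)


def jaccard(pred: str, truth: str) -> float:
    """Word-level Jaccard score between predicted and reference phrases
    (assumed metric; scores such as 73.5 are this value scaled by 100)."""
    a, b = set(pred.lower().split()), set(truth.lower().split())
    if not a and not b:
        return 1.0  # both empty: treated as a perfect match (convention may vary)
    return len(a & b) / len(a | b)
```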
Anthology ID:
2020.icon-main.64
Volume:
Proceedings of the 17th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2020
Address:
Indian Institute of Technology Patna, Patna, India
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
475–480
URL:
https://aclanthology.org/2020.icon-main.64
Cite (ACL):
Raj Pranesh, Sumit Kumar, and Ambesh Shekhar. 2020. CLPLM: Character Level Pretrained Language Model for Extracting Support Phrases for Sentiment Labels. In Proceedings of the 17th International Conference on Natural Language Processing (ICON), pages 475–480, Indian Institute of Technology Patna, Patna, India. NLP Association of India (NLPAI).
Cite (Informal):
CLPLM: Character Level Pretrained Language Model for Extracting Support Phrases for Sentiment Labels (Pranesh et al., ICON 2020)
PDF:
https://aclanthology.org/2020.icon-main.64.pdf