MojiTalk: Generating Emotional Responses at Scale

Xianda Zhou, William Yang Wang


Abstract
Generating emotional language is a key step towards building empathetic natural language processing agents. However, a major challenge for this line of research is the lack of large-scale labeled training data, and previous studies are limited to only small sets of human annotated sentiment labels. Additionally, explicitly controlling the emotion and sentiment of generated text is also difficult. In this paper, we take a more radical approach: we exploit the idea of leveraging Twitter data that are naturally labeled with emojis. We collect a large corpus of Twitter conversations that include emojis in the response and assume the emojis convey the underlying emotions of the sentence. We investigate several conditional variational autoencoders trained on these conversations, which allow us to use emojis to control the emotion of the generated text. Experimentally, we show in our quantitative and qualitative analyses that the proposed models can successfully generate high-quality abstractive conversation responses in accordance with designated emotions.
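The abstract describes conditioning a conversational variational autoencoder on an emoji label so that the emoji controls the emotion of the generated response. The sketch below illustrates that general idea in PyTorch; it is not the authors' released model, and the class name, layer sizes, and wiring are illustrative assumptions. Only the standard CVAE ingredients (recognition network, prior network, reparameterization, KL term) are taken from the usual formulation.

# A minimal CVAE sketch (assumption: PyTorch; not the MojiTalk code).
# The emoji label is embedded and fed to both the latent networks and the decoder.
import torch
import torch.nn as nn

class EmojiCVAE(nn.Module):
    def __init__(self, vocab_size, n_emojis, emb_dim=128, hid_dim=256, z_dim=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, emb_dim)
        self.emoji_emb = nn.Embedding(n_emojis, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)       # original tweet
        self.resp_encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)  # response (training only)
        # recognition network q(z | original, response, emoji)
        self.recog = nn.Linear(2 * hid_dim + emb_dim, 2 * z_dim)
        # prior network p(z | original, emoji), used at generation time
        self.prior = nn.Linear(hid_dim + emb_dim, 2 * z_dim)
        self.dec_init = nn.Linear(hid_dim + emb_dim + z_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps, with eps ~ N(0, I)
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, src, resp, emoji):
        _, h_src = self.encoder(self.tok_emb(src))
        _, h_resp = self.resp_encoder(self.tok_emb(resp))
        h_src, h_resp = h_src[-1], h_resp[-1]          # final hidden states, (B, hid)
        e = self.emoji_emb(emoji)                      # emoji label embedding, (B, emb)
        mu_q, logvar_q = self.recog(torch.cat([h_src, h_resp, e], -1)).chunk(2, -1)
        mu_p, logvar_p = self.prior(torch.cat([h_src, e], -1)).chunk(2, -1)
        z = self.reparameterize(mu_q, logvar_q)
        h0 = torch.tanh(self.dec_init(torch.cat([h_src, e, z], -1))).unsqueeze(0)
        dec_out, _ = self.decoder(self.tok_emb(resp[:, :-1]), h0)
        logits = self.out(dec_out)                     # predicts resp[:, 1:]
        # KL(q || p) between the two diagonal Gaussians
        kl = 0.5 * (logvar_p - logvar_q
                    + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp() - 1).sum(-1)
        return logits, kl.mean()

At test time only the prior network is available (there is no gold response), so generation samples z from p(z | original, emoji); swapping the emoji index is what changes the emotion of the sampled response in this sketch.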
Anthology ID:
P18-1104
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1128–1137
URL:
https://aclanthology.org/P18-1104
DOI:
10.18653/v1/P18-1104
Cite (ACL):
Xianda Zhou and William Yang Wang. 2018. MojiTalk: Generating Emotional Responses at Scale. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1128–1137, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
MojiTalk: Generating Emotional Responses at Scale (Zhou & Wang, ACL 2018)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/P18-1104.pdf
Note:
 P18-1104.Notes.pdf
Video:
 https://preview.aclanthology.org/nschneid-patch-1/P18-1104.mp4
Code:
 additional community code