On learning and representing social meaning in NLP: a sociolinguistic perspective

Dong Nguyen, Laura Rosseel, Jack Grieve


Abstract
The field of NLP has made substantial progress in building meaning representations. However, an important aspect of linguistic meaning, social meaning, has been largely overlooked. We introduce the concept of social meaning to NLP and discuss how insights from sociolinguistics can inform work on representation learning in NLP. We also identify key challenges for this new line of research.
Anthology ID: 2021.naacl-main.50
Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: June
Year: 2021
Address: Online
Editors: Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 603–612
URL: https://aclanthology.org/2021.naacl-main.50
DOI: 10.18653/v1/2021.naacl-main.50
Cite (ACL): Dong Nguyen, Laura Rosseel, and Jack Grieve. 2021. On learning and representing social meaning in NLP: a sociolinguistic perspective. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 603–612, Online. Association for Computational Linguistics.
Cite (Informal): On learning and representing social meaning in NLP: a sociolinguistic perspective (Nguyen et al., NAACL 2021)
PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/2021.naacl-main.50.pdf
Video: https://preview.aclanthology.org/naacl-24-ws-corrections/2021.naacl-main.50.mp4