SenteCon: Leveraging Lexicons to Learn Human-Interpretable Language Representations

Victoria Lin, Louis-Philippe Morency


Abstract
Although deep language representations have become the dominant form of language featurization in recent years, in many settings it is important to understand a model’s decision-making process. This necessitates not only an interpretable model but also interpretable features. In particular, language must be featurized in a way that is interpretable while still characterizing the original text well. We present SenteCon, a method for introducing human interpretability in deep language representations. Given a passage of text, SenteCon encodes the text as a layer of interpretable categories in which each dimension corresponds to the relevance of a specific category. Our empirical evaluations indicate that encoding language with SenteCon provides high-level interpretability at little to no cost to predictive performance on downstream tasks. Moreover, we find that SenteCon outperforms existing interpretable language representations with respect to both its downstream performance and its agreement with human characterizations of the text.
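The abstract describes encoding a passage as a vector of interpretable category relevances derived from a lexicon. As a minimal sketch of that general idea only (the toy lexicon, names, and normalization below are illustrative assumptions, not the authors' SenteCon implementation, which additionally leverages deep language representations):

```python
# Minimal sketch of a lexicon-based, human-interpretable encoding.
# All names and the toy lexicon are illustrative assumptions, not the
# SenteCon method itself.

from collections import Counter

# Toy lexicon: each word maps to one or more interpretable categories.
TOY_LEXICON = {
    "happy": ["positive_emotion"],
    "sad": ["negative_emotion"],
    "friend": ["social"],
    "think": ["cognition"],
}

CATEGORIES = sorted({c for cats in TOY_LEXICON.values() for c in cats})


def encode(passage: str) -> list[float]:
    """Return a vector where each dimension is the relevance of one category."""
    tokens = passage.lower().split()
    counts = Counter()
    for tok in tokens:
        for cat in TOY_LEXICON.get(tok, []):
            counts[cat] += 1
    # Normalize hit counts by passage length so scores are comparable
    # across passages of different lengths.
    total = max(len(tokens), 1)
    return [counts[cat] / total for cat in CATEGORIES]


if __name__ == "__main__":
    vec = encode("I think my friend is happy")
    print(dict(zip(CATEGORIES, vec)))
```

Each output dimension is tied to a named category, which is what makes the representation human-interpretable; the paper's contribution is obtaining such category scores while retaining the predictive strength of deep representations.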
Anthology ID:
2023.findings-acl.264
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4312–4331
URL:
https://aclanthology.org/2023.findings-acl.264
DOI:
10.18653/v1/2023.findings-acl.264
Cite (ACL):
Victoria Lin and Louis-Philippe Morency. 2023. SenteCon: Leveraging Lexicons to Learn Human-Interpretable Language Representations. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4312–4331, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
SenteCon: Leveraging Lexicons to Learn Human-Interpretable Language Representations (Lin & Morency, Findings 2023)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-acl.264.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2023.findings-acl.264.mp4