Multi-level Contrastive Learning for Script-based Character Understanding

Dawei Li, Hengyuan Zhang, Yanran Li, Shiping Yang


Abstract
In this work, we tackle the scenario of understanding characters in scripts, which aims to learn the characters’ personalities and identities from their utterances. We begin by analyzing several challenges in this scenario, and then propose a multi-level contrastive learning framework to capture characters’ global information in a fine-grained manner. To validate the proposed framework, we conduct extensive experiments on three character understanding sub-tasks, comparing against strong pre-trained language models, including SpanBERT, Longformer, BigBird and ChatGPT-3.5. Experimental results demonstrate that our method improves performance by a considerable margin. Through further in-depth analysis, we show the effectiveness of our method in addressing the challenges and provide more hints on the scenario of character understanding. We will open-source our work at this URL.
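The paper itself specifies the multi-level objective; purely as a rough illustration, the sketch below shows a generic in-batch InfoNCE contrastive loss of the kind such frameworks commonly build on. The function name, tensor shapes, and temperature value are assumptions for the example, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def info_nce_loss(anchor: torch.Tensor, positive: torch.Tensor, temperature: float = 0.07) -> torch.Tensor:
    """Generic in-batch InfoNCE contrastive loss (illustrative, not the paper's exact loss).

    anchor, positive: (batch_size, dim) representation pairs; the i-th anchor and
    the i-th positive are treated as a matching pair (e.g., two views of the same
    character's utterances), and all other in-batch items act as negatives.
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature            # (B, B) similarity matrix
    labels = torch.arange(anchor.size(0), device=anchor.device)  # diagonal entries are positives
    return F.cross_entropy(logits, labels)

# Illustrative usage with random embeddings
if __name__ == "__main__":
    a = torch.randn(8, 128)   # e.g., utterance-level representations
    p = torch.randn(8, 128)   # e.g., character-level representations of the same speakers
    print(info_nce_loss(a, p).item())
```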
Anthology ID:
2023.emnlp-main.366
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5995–6013
URL:
https://aclanthology.org/2023.emnlp-main.366
DOI:
10.18653/v1/2023.emnlp-main.366
Cite (ACL):
Dawei Li, Hengyuan Zhang, Yanran Li, and Shiping Yang. 2023. Multi-level Contrastive Learning for Script-based Character Understanding. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 5995–6013, Singapore. Association for Computational Linguistics.
Cite (Informal):
Multi-level Contrastive Learning for Script-based Character Understanding (Li et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.emnlp-main.366.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2023.emnlp-main.366.mp4