LLM-Driven Knowledge Injection Advances Zero-Shot and Cross-Target Stance Detection

Zhao Zhang, Yiming Li, Jin Zhang, Hui Xu


Abstract
Stance detection aims to infer an author’s attitude toward a specific target in a text. Prior methods mainly exploit target-related background information to better understand targets while neglecting the accompanying input texts. In this study, we propose prompting Large Language Models (LLMs) to explicitly extract the relationship between a paired text and target as contextual knowledge. We then inject this LLM-driven knowledge into the generation model BART to exploit its rich contexts and semantics. Moreover, to further enhance BART’s decoding capability, we design a novel prototypical contrastive scheme that aligns input contents with stance labels. Our experimental results demonstrate state-of-the-art performance across several publicly available datasets, showcasing effectiveness in both zero-shot and cross-target stance detection scenarios. We publicly release our code to facilitate future research.
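The pipeline the abstract describes (prompt an LLM for the text-target relationship, then inject that knowledge into the generation model's input) can be sketched roughly as follows. This is a minimal illustration, not the authors' released code: the prompt wording, function names, and input template are all assumptions.

```python
# Hypothetical sketch of LLM-driven knowledge injection for stance detection.
# Step 1: build a prompt asking an LLM to state how a text relates to a target.
# Step 2: concatenate the LLM's answer with the original input, so a seq2seq
# model (BART in the paper) conditions on both when generating a stance label.

def build_knowledge_prompt(text: str, target: str) -> str:
    """Prompt an LLM to extract the text-target relationship (wording assumed)."""
    return (
        f"Explain the relationship between the following text and the target "
        f"'{target}'.\nText: {text}\nRelationship:"
    )

def inject_knowledge(text: str, target: str, knowledge: str) -> str:
    """Fuse the LLM-extracted knowledge into one input string for the
    generation model (template is illustrative, not the paper's exact format)."""
    return f"text: {text} target: {target} knowledge: {knowledge}"

# Usage: the 'knowledge' string would come from querying an LLM with the prompt.
prompt = build_knowledge_prompt("We need cleaner air in our cities.", "electric cars")
model_input = inject_knowledge(
    "We need cleaner air in our cities.",
    "electric cars",
    "The text implicitly supports the target by valuing reduced emissions.",
)
```

The enriched `model_input` would then be fed to BART, whose decoder generates the stance label; the paper's prototypical contrastive scheme further aligns such inputs with their stance labels during training.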
Anthology ID:
2024.naacl-short.32
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
371–378
URL:
https://aclanthology.org/2024.naacl-short.32
Cite (ACL):
Zhao Zhang, Yiming Li, Jin Zhang, and Hui Xu. 2024. LLM-Driven Knowledge Injection Advances Zero-Shot and Cross-Target Stance Detection. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 2: Short Papers), pages 371–378, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
LLM-Driven Knowledge Injection Advances Zero-Shot and Cross-Target Stance Detection (Zhang et al., NAACL 2024)
PDF:
https://preview.aclanthology.org/ingestion-checklist/2024.naacl-short.32.pdf