Huang Yiting


2024

SynPrompt: Syntax-aware Enhanced Prompt Engineering for Aspect-based Sentiment Analysis
Wen Yin | Cencen Liu | Yi Xu | Ahmad Raza Wahla | Huang Yiting | Dezhang Zheng
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)

Although prompt learning has been applied to Aspect-based Sentiment Analysis (ABSA), existing prompt-tuning methods remain simplistic. Compared with vanilla fine-tuning, prompt learning intuitively bridges the gap in objective form between pre-training and fine-tuning. However, simply constructing prompts around aspect words fails to fully exploit the potential of Pre-trained Language Models (PLMs), and designing more robust, principled prompt engineering for downstream tasks remains an urgent and challenging problem. In this paper, we therefore propose a novel Syntax-aware Enhanced Prompt method (SynPrompt), which mines the key syntactic information related to aspect words from the syntactic dependency tree. Additionally, to effectively harness the domain-specific knowledge embedded within PLMs for ABSA, we construct two adaptive prompt frameworks that enhance the perception ability of the above method. Extensive experiments on three benchmark datasets show that our method consistently achieves favorable results. These findings not only demonstrate the effectiveness and rationality of the proposed method but also provide a powerful alternative to traditional prompt-tuning.
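
To make the idea of a syntax-aware prompt concrete, the sketch below shows one possible way to collect the dependency-tree neighbours of an aspect term and insert them into a cloze-style prompt. It is not the authors' implementation: the spaCy-based parsing, the neighbour selection, and the prompt template are all illustrative assumptions.

# Hypothetical sketch of syntax-aware prompt construction for ABSA.
# Assumes spaCy and its small English model are installed; the template and
# helper names are illustrative, not taken from the SynPrompt paper.
import spacy

nlp = spacy.load("en_core_web_sm")

def syntax_aware_prompt(sentence: str, aspect: str) -> str:
    doc = nlp(sentence)
    # Locate the tokens that make up the aspect term in the parsed sentence.
    aspect_words = set(aspect.lower().split())
    aspect_tokens = [t for t in doc if t.text.lower() in aspect_words]
    # Gather syntactically related words: the head and direct children of each aspect token.
    related = []
    for tok in aspect_tokens:
        if tok.head is not tok:
            related.append(tok.head.text)
        related.extend(child.text for child in tok.children)
    context = ", ".join(dict.fromkeys(related)) or "none"
    # Cloze-style template for a masked language model; [MASK] is the sentiment slot.
    return (f"{sentence} Considering the words {context} related to {aspect}, "
            f"the sentiment toward {aspect} is [MASK].")

print(syntax_aware_prompt("The battery life is great but the screen is dim.", "battery life"))

The filled [MASK] token would then be mapped to a sentiment label (e.g. great -> positive) by a verbalizer, which is the usual final step in prompt-tuning pipelines of this kind.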