Zhang Yifei


2021

pdf bib
Multi-level Emotion Cause Analysis by Multi-head Attention Based Multi-task Learning
Li Xiangju | Feng Shi | Zhang Yifei | Wang Daling
Proceedings of the 20th Chinese National Conference on Computational Linguistics

“Emotion cause analysis (ECA) aims to identify the potential causes behind certain emotions in text. Many ECA models have been designed to extract the emotion cause at the clause level. However, in many scenarios, extracting only the cause clause is ambiguous. To ease this problem, in this paper we introduce multi-level emotion cause analysis, which focuses on identifying the emotion cause clause (ECC) and emotion cause keywords (ECK) simultaneously. ECK is the more challenging task, since it requires capturing not only a specific understanding of the role of each word in the clause but also the relation between each word and the emotion expression. We observe that the ECK task can incorporate contextual information from the ECC task, while the ECC task can be improved by learning, from the ECK task, the correlation between emotion cause keywords and the emotion. To fulfill the goal of joint learning, we propose a multi-head attention based multi-task learning method, which utilizes a series of mechanisms, including shared and private feature extractors, multi-head attention, emotion attention, and label embedding, to capture features and correlations between the two tasks. Experimental results show that the proposed method consistently outperforms state-of-the-art methods on a benchmark emotion cause dataset.”
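
As a rough, non-authoritative illustration of the shared/private multi-task design the abstract describes, the following minimal PyTorch sketch wires a shared multi-head attention extractor and two private ones to a clause-level (ECC) head and a word-level (ECK) head. All class names, dimensions, and heads are hypothetical assumptions; the paper's emotion attention and label embedding mechanisms are deliberately omitted.

```python
# Minimal sketch (not the paper's implementation): shared and private
# multi-head attention feature extractors feeding two task heads.
import torch
import torch.nn as nn

class SharedPrivateMTL(nn.Module):
    def __init__(self, d_model=256, n_heads=8, n_clause_labels=2, n_word_labels=3):
        super().__init__()
        # One extractor shared by both tasks, one private extractor per task.
        self.shared = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.private_ecc = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.private_eck = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ecc_head = nn.Linear(2 * d_model, n_clause_labels)  # clause-level cause
        self.eck_head = nn.Linear(2 * d_model, n_word_labels)    # per-word keywords

    def forward(self, x):
        # x: (batch, seq_len, d_model) word/clause representations.
        s, _ = self.shared(x, x, x)        # features shared across tasks
        pc, _ = self.private_ecc(x, x, x)  # ECC-private features
        pk, _ = self.private_eck(x, x, x)  # ECK-private features
        ecc_logits = self.ecc_head(torch.cat([s, pc], dim=-1)).mean(dim=1)  # one label per clause
        eck_logits = self.eck_head(torch.cat([s, pk], dim=-1))              # one label per token
        return ecc_logits, eck_logits

# Toy usage with random inputs: 4 clauses of 30 tokens each.
ecc_logits, eck_logits = SharedPrivateMTL()(torch.randn(4, 30, 256))
print(ecc_logits.shape, eck_logits.shape)  # torch.Size([4, 2]) torch.Size([4, 30, 3])
```

Joint training would sum a clause-level loss on `ecc_logits` and a token-level loss on `eck_logits`, which is how the shared extractor lets each task benefit from the other.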

pdf bib
NS-Hunter: BERT-Cloze Based Semantic Denoising for Distantly Supervised Relation Classification
Shen Tielin | Wang Daling | Feng Shi | Zhang Yifei
Proceedings of the 20th Chinese National Conference on Computational Linguistics

“Distant supervision can generate large-scale relation classification data quickly and economically. However, a great number of noise sentences are introduced that cannot express their labeled relations. Leveraging the powerful capabilities of the pre-trained language model BERT, in this paper we propose a BERT-based semantic denoising approach for distantly supervised relation classification. In detail, we define an entity pair as a source entity and a target entity. For the specific sentences whose target entity is in the BERT vocabulary (a one-token word), we present the differences in dependency between the two entities for noise and non-noise sentences. For general sentences whose target entity is a multi-token word, we further present the differences in the last hidden states of the [MASK]-entity (MASK-lhs for short) in BERT for noise and non-noise sentences. We regard the dependency and MASK-lhs in BERT as two semantic features of sentences. With BERT, we first capture the dependency feature to discriminate specific sentences, then capture the MASK-lhs feature to denoise distant supervision datasets. We propose NS-Hunter, a novel denoising model which leverages BERT's cloze ability to capture the two semantic features and integrates the above functions. According to experiments on NYT data, our NS-Hunter model achieves the best results in distant supervision denoising and sentence-level relation classification. Keywords: distant supervision, relation classification, semantic denoising”
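
To make the MASK-lhs feature concrete, here is a small sketch, assuming the Hugging Face transformers package, that replaces a target entity with [MASK] tokens and reads BERT's last hidden states at those positions. The helper name and the toy sentence are hypothetical; how NS-Hunter turns these vectors into a noise decision follows the paper's procedure and is not reproduced here.

```python
# Hypothetical sketch of the MASK-lhs feature: mask the target entity and
# read BERT's last hidden states at the [MASK] positions. Requires the
# Hugging Face `transformers` package; the denoising decision is omitted.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def mask_lhs(sentence: str, target_entity: str) -> torch.Tensor:
    """Last hidden states at the [MASK] positions replacing the entity."""
    # One [MASK] per word piece of the (possibly multi-token) target entity.
    n_pieces = len(tokenizer.tokenize(target_entity))
    masks = " ".join([tokenizer.mask_token] * n_pieces)
    enc = tokenizer(sentence.replace(target_entity, masks), return_tensors="pt")
    with torch.no_grad():
        out = model(**enc)
    positions = (enc["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    return out.last_hidden_state[0, positions]  # shape: (n_pieces, hidden_size)

# Toy usage: feature vectors for a masked target entity.
feats = mask_lhs("Steve Jobs founded Apple in California.", "California")
print(feats.shape)  # e.g. torch.Size([1, 768])
```

Comparing such MASK-lhs vectors between sentences that do and do not express the labeled relation is the intuition behind the feature; the paper's actual scoring and thresholding are not shown above.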