Fanyu Meng
2020
Adversarial Semantic Decoupling for Recognizing Open-Vocabulary Slots
Yuanmeng Yan | Keqing He | Hong Xu | Sihong Liu | Fanyu Meng | Min Hu | Weiran Xu
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Open-vocabulary slots, such as file name, album name, or schedule title, significantly degrade the performance of neural slot filling models, since these slots can take values from a virtually unlimited set and have neither semantic restrictions nor length limits. In this paper, we propose a robust, adversarial, model-agnostic slot filling method that explicitly decouples the local semantics inherent in open-vocabulary slot words from the global context. We aim to disentangle these entangled contextual semantics and focus more on the holistic context at the level of the whole sentence. Experiments on two public datasets show that our method consistently outperforms other methods by a statistically significant margin on all open-vocabulary slots, without deteriorating performance on normal slots.
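The abstract does not spell out the adversarial training procedure, but the general family it belongs to can be illustrated with an FGM-style perturbation of an input embedding: take a gradient step of the loss with respect to the input, normalized to a small radius. Everything here (the toy linear classifier, the names, the epsilon value) is a hypothetical sketch for illustration, not the paper's actual decoupling objective.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def fgm_perturb(x, W, y, eps=0.1):
    """Return x plus an adversarial perturbation: the cross-entropy
    gradient w.r.t. the input, rescaled to L2 norm eps (FGM-style)."""
    p = softmax(x @ W)                  # class probabilities of a toy linear model
    onehot = np.zeros_like(p)
    onehot[y] = 1.0
    g = W @ (p - onehot)                # dL/dx for cross-entropy over logits x @ W
    return x + eps * g / (np.linalg.norm(g) + 1e-12)

# Hypothetical 3-dim "token embedding" and 4-way slot label space.
x = np.array([1.0, -0.5, 0.3])
W = np.random.default_rng(0).normal(size=(3, 4))
x_adv = fgm_perturb(x, W, y=2)
```

Training the model on such perturbed inputs alongside clean ones is one common way to make slot representations less reliant on brittle local token features.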
A structure-enhanced graph convolutional network for sentiment analysis
Fanyu Meng | Junlan Feng | Danping Yin | Si Chen | Min Hu
Findings of the Association for Computational Linguistics: EMNLP 2020
Syntactic information is essential for both sentiment analysis (SA) and aspect-based sentiment analysis (ABSA). Previous work has already achieved great progress by applying a Graph Convolutional Network (GCN) over the dependency tree of a sentence. However, these models do not fully exploit the syntactic information obtained from dependency parsing, such as the diversified types of dependency relations, and the message passing process of the GCN should be differentiated based on this information. To tackle this problem, we design a novel weighted graph convolutional network (WGCN) which can exploit rich syntactic information through feature combination. Furthermore, we utilize BERT instead of Bi-LSTM to generate contextualized representations as inputs for the GCN, and present an alignment method that keeps word-level dependencies consistent with the wordpiece units of BERT. With our proposal, we improve the state-of-the-art on four of six ABSA tasks and two of three SA tasks.
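The core idea of weighting GCN message passing by dependency relation type can be sketched in a few lines. The relation names, the weight table, and the single-layer update below are illustrative assumptions, not the paper's exact WGCN formulation.

```python
import numpy as np

# Hypothetical per-relation edge weights; the paper learns richer
# relation features, this table is only for illustration.
REL_WEIGHT = {"nsubj": 1.0, "dobj": 0.9, "amod": 0.7, "det": 0.2}

def wgcn_layer(H, edges, n):
    """One propagation step over a dependency tree: each node averages
    its neighbors, with every edge scaled by a weight derived from its
    dependency relation label."""
    A = np.eye(n)                          # self-loops
    for head, dep, rel in edges:
        w = REL_WEIGHT.get(rel, 0.5)       # default weight for unseen relations
        A[head, dep] = A[dep, head] = w
    D = A.sum(axis=1, keepdims=True)       # degree normalization
    return np.maximum(A @ H / D, 0.0)      # ReLU((D^-1 A) H)

# Tiny 3-token example with one-hot node features.
H = np.eye(3)
edges = [(1, 0, "nsubj"), (1, 2, "amod")]  # (head, dependent, relation)
H1 = wgcn_layer(H, edges, 3)
```

With this weighting, an `nsubj` edge passes its message at full strength while a `det` edge is heavily attenuated, so the aggregation distinguishes relation types instead of treating the tree as an unlabeled graph.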
Co-authors
- Min Hu 2
- Yuanmeng Yan 1
- Keqing He 1
- Hong Xu 1
- Sihong Liu 1