2018
Adaptive Learning of Local Semantic and Global Structure Representations for Text Classification
Jianyu Zhao | Zhiqiang Zhan | Qichuan Yang | Yang Zhang | Changjian Hu | Zhensheng Li | Liuxin Zhang | Zhiqiang He
Proceedings of the 27th International Conference on Computational Linguistics
Representation learning is a key issue for most Natural Language Processing (NLP) tasks. Most existing representation models either capture little structural information or rely on pre-defined structures, which degrades performance and generalization capability. This paper focuses on learning both local semantic and global structure representations for text classification. In detail, we propose a novel Sandwich Neural Network (SNN) that learns semantic and structure representations automatically, without relying on parsers. More importantly, semantic and structure information contribute unequally to the text representation at the corpus and instance levels. To solve this fusion problem, we propose two strategies: the Adaptive Learning Sandwich Neural Network (AL-SNN) and the Self-Attention Sandwich Neural Network (SA-SNN). The former learns the fusion weights at the corpus level, and the latter further combines an attention mechanism to assign the weights at the instance level. Experimental results demonstrate that our approach achieves competitive performance on several text classification tasks, including sentiment analysis, question type classification and subjectivity classification. Specifically, the accuracies are MR (82.1%), SST-5 (50.4%), TREC (96.0%) and SUBJ (93.9%).
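The two fusion strategies in the abstract can be illustrated with a minimal numpy sketch. This is a hypothetical reconstruction, not the paper's implementation: `fuse_corpus_level` stands in for AL-SNN-style fusion with a single learned weight shared across the corpus, and `fuse_instance_level` stands in for SA-SNN-style fusion, where per-instance attention scores weight the semantic and structure vectors. All function and parameter names here are assumptions for illustration.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def fuse_corpus_level(sem, struct, alpha):
    # AL-SNN-style fusion (sketch): one scalar weight `alpha`, learned once
    # for the whole corpus; a sigmoid keeps the mixing weight in (0, 1).
    w = 1.0 / (1.0 + np.exp(-alpha))
    return w * sem + (1.0 - w) * struct

def fuse_instance_level(sem, struct, query):
    # SA-SNN-style fusion (sketch): attention scores are computed per
    # instance by scoring each representation against a learned query.
    scores = softmax(np.array([sem @ query, struct @ query]))
    return scores[0] * sem + scores[1] * struct
```

In both cases the output is a convex combination of the semantic and structure vectors; the difference is only where the weights come from (one shared parameter versus a per-instance attention computation).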