2019
Investigating Dynamic Routing in Tree-Structured LSTM for Sentiment Analysis
Jin Wang | Liang-Chih Yu | K. Robert Lai | Xuejie Zhang
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Deep neural network models such as long short-term memory (LSTM) and tree-LSTM have been proven effective for sentiment analysis. However, sequential LSTM is a biased model in which the words at the tail of a sentence are more heavily emphasized than those at the head when building sentence representations. Even tree-LSTM, despite its useful structural information, cannot avoid this bias problem, because the root node becomes dominant while the nodes at the bottom of the parse tree receive less emphasis even though they may contain salient information. To overcome the bias problem, this study proposes a capsule tree-LSTM model that introduces a dynamic routing algorithm as an aggregation layer, building the sentence representation by assigning different weights to nodes according to their contributions to the prediction. Experiments on the Stanford Sentiment Treebank (SST) for sentiment classification and EmoBank for regression show that the proposed method improves the performance of tree-LSTM and other neural network models. Moreover, the deeper the tree structure, the greater the improvement.
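The aggregation idea can be sketched as routing-by-agreement over the tree's node vectors. The snippet below is a minimal, framework-free illustration under our own assumptions (function names and the fixed iteration count are ours; the paper's capsule model also applies learned transformations and a squashing nonlinearity, which this sketch omits):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dynamic_routing(node_vectors, iterations=3):
    """Aggregate a list of node vectors into one sentence vector by
    iteratively re-weighting each node by its agreement (dot product)
    with the current weighted sum (routing-by-agreement)."""
    dim = len(node_vectors[0])
    logits = [0.0] * len(node_vectors)

    def weighted_sum(weights):
        return [sum(w * v[i] for w, v in zip(weights, node_vectors))
                for i in range(dim)]

    for _ in range(iterations):
        weights = softmax(logits)
        agg = weighted_sum(weights)
        # nodes that agree with the aggregate get a larger routing weight
        logits = [b + sum(a * x for a, x in zip(v, agg))
                  for b, v in zip(logits, node_vectors)]
    weights = softmax(logits)
    return weighted_sum(weights), weights
```

With three node vectors where two point in a similar direction, the routing weights concentrate on the agreeing majority rather than on any fixed position, which is the intended contrast with a position-biased aggregation.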
2017
Refining Word Embeddings for Sentiment Analysis
Liang-Chih Yu | Jin Wang | K. Robert Lai | Xuejie Zhang
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Word embeddings that can capture semantic and syntactic information from contexts have been extensively used for various natural language processing tasks. However, existing methods for learning context-based word embeddings typically fail to capture sufficient sentiment information. This may result in words with similar vector representations having opposite sentiment polarities (e.g., good and bad), thus degrading sentiment analysis performance. Therefore, this study proposes a word vector refinement model that can be applied to any pre-trained word vectors (e.g., Word2vec and GloVe). The refinement model adjusts the vector representations of words such that they are closer to both semantically and sentimentally similar words and further away from sentimentally dissimilar words. Experimental results show that the proposed method improves conventional word embeddings and outperforms previously proposed sentiment embeddings for both binary and fine-grained classification on the Stanford Sentiment Treebank (SST).
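A minimal sketch of the refinement step, assuming pre-trained vectors are given as plain lists together with a per-word list of semantically-and-sentimentally similar neighbors (the names `refine_vectors` and `alpha` are ours; the actual model also pushes vectors away from sentimentally dissimilar words, which this one-sided sketch omits):

```python
def refine_vectors(vectors, neighbors, alpha=0.1, steps=10):
    """Nudge each word's vector toward the centroid of its semantically
    and sentimentally similar neighbors (one-sided refinement sketch).

    vectors:   dict mapping word -> list[float]
    neighbors: dict mapping word -> list of neighbor words
    alpha:     interpolation rate toward the neighbor centroid
    """
    vecs = {w: list(v) for w, v in vectors.items()}
    for _ in range(steps):
        updated = {}
        for word, vec in vecs.items():
            nbrs = neighbors.get(word, [])
            if not nbrs:
                updated[word] = vec  # no sentiment neighbors: leave as-is
                continue
            centroid = [sum(vecs[n][i] for n in nbrs) / len(nbrs)
                        for i in range(len(vec))]
            # interpolate toward the neighbor centroid
            updated[word] = [(1 - alpha) * a + alpha * b
                             for a, b in zip(vec, centroid)]
        vecs = updated
    return vecs
```

After refinement, mutually similar words (e.g., "good" and "great") end up closer together, while words without listed neighbors keep their pre-trained vectors.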
YZU-NLP at EmoInt-2017: Determining Emotion Intensity Using a Bi-directional LSTM-CNN Model
Yuanye He | Liang-Chih Yu | K. Robert Lai | Weiyi Liu
Proceedings of the 8th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis
The EmoInt-2017 task aims to determine a continuous numerical value representing the intensity with which an emotion is expressed in a tweet. Compared to classification tasks that identify one among n emotions for a tweet, this task provides more fine-grained (real-valued) sentiment analysis. This paper presents a system that uses a bi-directional LSTM-CNN model to complete the competition task. By combining a bi-directional LSTM and a CNN, the prediction process considers both the global information in a tweet and important local information. The proposed method ranked sixth among twenty-one teams in terms of the Pearson correlation coefficient.
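The intuition of pairing a sequence-level (global) view with a window-level (local) view can be illustrated without a deep-learning framework. In the sketch below the bi-directional recurrent pass is stood in for by forward/backward exponential moving averages and the CNN by a sliding-window dot product with max-pooling; all names, the kernel, and the combination weights are illustrative, not the paper's model:

```python
def running_summary(seq, decay=0.5):
    """Stand-in for one recurrent pass: exponential moving average."""
    state = 0.0
    for x in seq:
        state = decay * state + (1 - decay) * x
    return state

def local_feature(seq, kernel=(0.5, 1.0, 0.5)):
    """Stand-in for convolution + max-pooling: score each sliding
    window against the kernel and keep the best-matching window."""
    width = len(kernel)
    scores = [sum(k * x for k, x in zip(kernel, seq[i:i + width]))
              for i in range(len(seq) - width + 1)]
    return max(scores)

def intensity_score(seq):
    """Combine a global (whole-sequence, both directions) view with a
    local (best-window) view, as the bi-LSTM-CNN combination does at
    a much larger scale."""
    fwd = running_summary(seq)
    bwd = running_summary(list(reversed(seq)))
    return 0.5 * (fwd + bwd) + local_feature(seq)
```

A short burst in the middle of an otherwise flat sequence is picked up by the local feature even though the global summary dilutes it, which is exactly the complementarity the combined model exploits.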
SentiNLP at IJCNLP-2017 Task 4: Customer Feedback Analysis Using a Bi-LSTM-CNN Model
Shuying Lin | Huosheng Xie | Liang-Chih Yu | K. Robert Lai
Proceedings of the IJCNLP 2017, Shared Tasks
The analysis of customer feedback is useful for providing good customer service. A large volume of online customer feedback is produced, and manual classification is impractical because of the high volume of data. Therefore, automatic classification of customer feedback is important for an analysis system to identify the meanings or intentions that customers express. The aim of Shared Task 4 of IJCNLP 2017 is to classify customer feedback into a six-tag categorization. In this paper, we present a system that uses word embeddings to represent the features of the sentences in the corpus and a neural network as the classifier to complete the shared task; an ensemble method is then used to obtain the final prediction. The proposed method ranked first among twelve teams in terms of micro-averaged F1 and second on the accuracy metric.
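The final ensembling step over individual classifiers' outputs can be as simple as majority voting; a minimal sketch (the example tag names below are invented for illustration — the shared task defines its own tag set):

```python
from collections import Counter

def ensemble_vote(predictions):
    """Return the label predicted by the most classifiers for one
    example; Counter.most_common breaks ties by first-seen order."""
    return Counter(predictions).most_common(1)[0][0]
```

For instance, if three classifiers label a feedback item `["comment", "request", "comment"]`, the ensemble returns `"comment"`.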
Mining Language Patterns Using Word Embeddings (應用詞向量於語言樣式探勘之研究) [In Chinese]
Xiang Xiao | Shao-Zhen Ye | Liang-Chih Yu | K. Robert Lai
Proceedings of the 29th Conference on Computational Linguistics and Speech Processing (ROCLING 2017)
2016
Dimensional Sentiment Analysis Using a Regional CNN-LSTM Model
Jin Wang | Liang-Chih Yu | K. Robert Lai | Xuejie Zhang
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Building Chinese Affective Resources in Valence-Arousal Dimensions
Liang-Chih Yu | Lung-Hao Lee | Shuai Hao | Jin Wang | Yunchao He | Jun Hu | K. Robert Lai | Xuejie Zhang
Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
YZU-NLP Team at SemEval-2016 Task 4: Ordinal Sentiment Classification Using a Recurrent Convolutional Network
Yunchao He | Liang-Chih Yu | Chin-Sheng Yang | K. Robert Lai | Weiyi Liu
Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016)
2015
Predicting Valence-Arousal Ratings of Words Using a Weighted Graph Method
Liang-Chih Yu | Jin Wang | K. Robert Lai | Xuejie Zhang
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)