Yi-Shin Chen


2022

Unsupervised Text Summarization of Long Documents using Dependency-based Noun Phrases and Contextual Order Arrangement
Yen-Hao Huang | Hsiao-Yen Lan | Yi-Shin Chen
Proceedings of the 34th Conference on Computational Linguistics and Speech Processing (ROCLING 2022)

Unsupervised extractive summarization has recently gained importance since it does not require labeled data. Among unsupervised methods, graph-based approaches have achieved outstanding results. These methods represent each document as a graph, with sentences as nodes and word-level similarity between sentences as edges. Common words can easily create strong connections between sentence nodes, so sentences that share many common words can be misinterpreted as salient sentences for a summary. This work addresses the common-word issue with a phrase-level graph that (1) focuses on the noun phrases of a document based on grammar dependencies and (2) initializes edge weights by term frequency within the target document and inverse document frequency over the entire corpus. The importance scores of noun phrases extracted from the graph are then used to select the most salient sentences. To preserve summary coherence, the order of the selected sentences is re-arranged by a flow-aware orderBERT. The results reveal that our unsupervised framework outperforms other extractive methods on ROUGE as well as in two human evaluations of semantic similarity and summary coherence.
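To make the phrase-graph idea concrete, here is a minimal sketch, assuming noun phrases have already been extracted (e.g., via dependency parsing) and that a per-phrase tf-idf score is available; it uses pairwise co-occurrence edges and PageRank centrality as stand-ins, not the paper's exact graph construction or scoring.

```python
# A minimal sketch of a phrase-level graph, NOT the paper's exact pipeline:
# noun phrases become nodes, edges between co-occurring phrases are
# initialized from tf-idf, and PageRank-style centrality ranks phrases.
# `tfidf_weight` is a hypothetical callable supplied by the caller.
import itertools
import networkx as nx

def rank_noun_phrases(doc_phrases, tfidf_weight):
    """doc_phrases: noun-phrase strings from one document.
    tfidf_weight: phrase -> tf-idf score (tf in-document, idf over corpus)."""
    g = nx.Graph()
    g.add_nodes_from(set(doc_phrases))
    # Connect every pair of distinct phrases in the document; initialize the
    # edge weight from the phrases' tf-idf scores, as the abstract describes.
    for a, b in itertools.combinations(set(doc_phrases), 2):
        g.add_edge(a, b, weight=tfidf_weight(a) * tfidf_weight(b))
    return nx.pagerank(g, weight="weight")  # phrase -> importance score

# Sentences can then be scored by the importance of the phrases they contain,
# and the highest-scoring sentences selected for the summary.
```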

ConTextING: Granting Document-Wise Contextual Embeddings to Graph Neural Networks for Inductive Text Classification
Yen-Hao Huang | Yi-Hsin Chen | Yi-Shin Chen
Proceedings of the 29th International Conference on Computational Linguistics

Graph neural networks (GNNs) have recently been applied in natural language processing. Various GNN studies propose learning node interactions within the local graph of each document, which contains words, sentences, or topics, for inductive text classification. However, most inductive GNNs built on a word graph take global word embeddings as node features, without referring to document-wise contextual information. Consequently, we find that BERT models can perform better than inductive GNNs. An intuitive follow-up approach is to enrich GNNs with contextual embeddings from BERT, yet related research is lacking. In this work, we propose a simple yet effective unified model, coined ConTextING, with a joint training mechanism to learn from both document embeddings and contextual word interactions simultaneously. Our experiments show that ConTextING outperforms pure inductive GNNs and BERT-style models. The analyses also highlight the benefits of the sub-word graph and of joint training with separate classifiers.
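As an illustration of the joint idea (not the actual ConTextING architecture), the sketch below feeds BERT's contextual token embeddings into one simplified graph-propagation step over a precomputed adjacency matrix and sums the logits of two separate classifiers; the module names and the single-linear-layer "GNN" are hypothetical simplifications.

```python
# A simplified sketch of jointly training a document branch and a graph
# branch on BERT contextual embeddings; the real ConTextING model and its
# sub-word graph construction differ. Names here are illustrative only.
import torch
import torch.nn as nn
from transformers import AutoModel

class JointDocGraphClassifier(nn.Module):
    def __init__(self, num_classes, hidden=768):
        super().__init__()
        self.encoder = AutoModel.from_pretrained("bert-base-uncased")
        self.gnn = nn.Linear(hidden, hidden)             # stand-in for a GNN layer
        self.doc_clf = nn.Linear(hidden, num_classes)    # document classifier
        self.graph_clf = nn.Linear(hidden, num_classes)  # graph classifier

    def forward(self, input_ids, attention_mask, adj):
        # adj: (batch, seq, seq) adjacency over sub-word token nodes.
        # Contextual token embeddings serve as node features for the graph.
        tokens = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        node_repr = torch.relu(self.gnn(adj @ tokens))   # one propagation step
        graph_logits = self.graph_clf(node_repr.mean(dim=1))
        doc_logits = self.doc_clf(tokens[:, 0])          # [CLS] document view
        return doc_logits + graph_logits  # joint prediction from both branches
```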

2021

Unsupervised Multi-document Summarization for News Corpus with Key Synonyms and Contextual Embeddings
Yen-Hao Huang | Ratana Pornvattanavichai | Fernando Henrique Calderon Alvarado | Yi-Shin Chen
Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021)

Information overload has been one of the challenges of the Internet era. The issue is no longer one of information access; instead, the focus has shifted towards the quality of the retrieved data. In the news domain particularly, multiple outlets report on the same news events but may differ in details. This work observes that different news outlets are likely to differ in writing style and word choice, and proposes a method to extract sentences based on their key information by focusing on the shared synonyms in each sentence. Our method also reduces redundancy through hierarchical clustering and arranges the selected sentences with the proposed orderBERT. The results show that the proposed unsupervised framework improves the coverage and coherence of a generated summary while reducing redundancy. Moreover, because of how the dataset was obtained, we also propose a data refinement method to alleviate problems with undesirable text that results from automatic scraping.
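A minimal sketch of the redundancy-reduction step is given below, assuming sentence embeddings are already computed; it clusters similar sentences hierarchically and keeps one representative per cluster. The threshold and linkage method are illustrative choices, and the key-synonym extraction and orderBERT re-ordering stages are not reproduced.

```python
# Sketch: drop near-duplicate sentences via hierarchical clustering on
# precomputed sentence embeddings. The threshold and linkage method are
# illustrative choices, not values from the paper.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def reduce_redundancy(sentences, embeddings, threshold=0.3):
    """Keep one representative sentence per cluster of similar sentences."""
    z = linkage(np.asarray(embeddings), method="average", metric="cosine")
    labels = fcluster(z, t=threshold, criterion="distance")  # cut the dendrogram
    representative = {}
    for sentence, label in zip(sentences, labels):
        representative.setdefault(label, sentence)  # first seen wins
    return list(representative.values())
```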

2019

Discovering the Latent Writing Style from Articles: A Contextualized Feature Extraction Approach
Yen-Hao Huang | Ting-Wei Liu | Ssu-Rui Lee | Ya-Wen Yu | Wan-Hsuan Lee | Fernando Henrique Calderon Alvarado | Yi-Shin Chen
International Journal of Computational Linguistics & Chinese Language Processing, Volume 24, Number 1, June 2019

2018

CARER: Contextualized Affect Representations for Emotion Recognition
Elvis Saravia | Hsien-Chi Toby Liu | Yen-Hao Huang | Junlin Wu | Yi-Shin Chen
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing

Emotions are expressed in nuanced ways that vary by collective or individual experiences, knowledge, and beliefs. Therefore, to understand emotion as conveyed through text, a robust mechanism capable of capturing and modeling different linguistic nuances and phenomena is needed. We propose a semi-supervised, graph-based algorithm to produce rich structural descriptors which serve as the building blocks for constructing contextualized affect representations from text. The pattern-based representations are further enriched with word embeddings and evaluated through several emotion recognition tasks. Our experimental results demonstrate that the proposed method outperforms state-of-the-art techniques on emotion recognition tasks.
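The sketch below illustrates only the enrichment step the abstract mentions, combining a pattern-based feature vector with averaged word embeddings by concatenation; the graph-based pattern extraction itself is assumed to have run already, and all names are placeholders.

```python
# Sketch: enrich a pattern-based representation with word embeddings by
# concatenation. `pattern_vector` and `embedding_lookup` are assumed to come
# from earlier stages (pattern extraction, pretrained embeddings).
import numpy as np

def enrich_with_embeddings(pattern_vector, tokens, embedding_lookup, dim=300):
    """Concatenate pattern features with the mean embedding of known tokens."""
    vectors = [embedding_lookup[t] for t in tokens if t in embedding_lookup]
    mean_embedding = np.mean(vectors, axis=0) if vectors else np.zeros(dim)
    return np.concatenate([np.asarray(pattern_vector), mean_embedding])
```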

2014

Multi-Lingual Sentiment Analysis of Social Data Based on Emotion-Bearing Patterns
Carlos Argueta | Yi-Shin Chen
Proceedings of the Second Workshop on Natural Language Processing for Social Media (SocialNLP)

Collaborative Ranking between Supervised and Unsupervised Approaches for Keyphrase Extraction
Gerardo Figueroa | Yi-Shin Chen
Proceedings of the 26th Conference on Computational Linguistics and Speech Processing (ROCLING 2014)