Nianzu Ma


2022

Semantic Novelty Detection and Characterization in Factual Text Involving Named Entities
Nianzu Ma | Sahisnu Mazumder | Alexander Politowicz | Bing Liu | Eric Robertson | Scott Grigsby
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing

Much of the existing work on text novelty detection has been conducted at the topic level, i.e., identifying whether the topic of a document or a sentence is novel or not. Little work has been done at the fine-grained semantic (or contextual) level. For example, given that we know Elon Musk is the CEO of a technology company, the sentence “Elon Musk acted in the sitcom The Big Bang Theory” is novel and surprising because normally a CEO would not be an actor. Existing topic-based novelty detection methods work poorly on this problem because they do not perform semantic reasoning involving relations between named entities in the text and their background knowledge. This paper proposes an effective model (called PAT-SND) to solve the problem, which can also characterize the novelty. An annotated dataset is also created. Evaluation shows that PAT-SND outperforms 10 baselines by large margins.
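
As a rough illustration of the underlying idea (not the PAT-SND model itself), the sketch below scores a new assertion about a named entity against toy background knowledge; the KB entries, compatibility scores, and threshold are invented for illustration.

```python
# Toy sketch of fine-grained semantic novelty scoring against background
# knowledge about a named entity. NOT the PAT-SND model from the paper;
# all facts, scores, and the threshold below are invented.

# Hypothetical background knowledge: (entity, relation) -> known values.
BACKGROUND_KB = {
    ("Elon Musk", "occupation"): {"CEO"},
}

# Hypothetical compatibility between a known attribute and a new assertion.
COMPATIBILITY = {
    ("CEO", "actor"): 0.1,           # a CEO acting in a sitcom is surprising
    ("CEO", "public speaker"): 0.8,  # a CEO giving talks is expected
}

def novelty_score(entity, new_attribute, threshold=0.3):
    """Return (is_novel, score): low compatibility with what we already know
    about the entity means the new assertion is novel/surprising."""
    known = BACKGROUND_KB.get((entity, "occupation"), set())
    if not known:
        return True, 0.0  # nothing known about the entity: treat as novel
    score = max(COMPATIBILITY.get((attr, new_attribute), 0.5) for attr in known)
    return score < threshold, score

print(novelty_score("Elon Musk", "actor"))           # (True, 0.1)  -> novel
print(novelty_score("Elon Musk", "public speaker"))  # (False, 0.8) -> expected
```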

2021

Semantic Novelty Detection in Natural Language Descriptions
Nianzu Ma | Alexander Politowicz | Sahisnu Mazumder | Jiahua Chen | Bing Liu | Eric Robertson | Scott Grigsby
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

This paper proposes to study a fine-grained semantic novelty detection task, which can be illustrated with the following example. It is normal that a person walks a dog in the park, but if someone says “A man is walking a chicken in the park”, it is novel. Given a set of natural language descriptions of normal scenes, we want to identify descriptions of novel scenes. We are not aware of any existing work that solves the problem. Although existing novelty or anomaly detection algorithms are applicable, they are usually topic-based and thus perform poorly on our fine-grained semantic novelty detection task. This paper proposes an effective model (called GAT-MA) to solve the problem and also contributes a new dataset. Experimental evaluation shows that GAT-MA outperforms 11 baselines by large margins.
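
As a toy illustration of the general setting (not the GAT-MA model), the sketch below flags a description as novel when its (subject, verb, object) triple has no support in the normal-scene data; the triples and support threshold are invented.

```python
# Toy sketch: detect novel scene descriptions by checking whether the
# (verb, object) pair was ever observed in descriptions of normal scenes.
# NOT the GAT-MA model from the paper; the data below is invented.

from collections import Counter

# Hypothetical triples parsed from "normal" training descriptions.
NORMAL_TRIPLES = [
    ("man", "walk", "dog"),
    ("woman", "walk", "dog"),
    ("child", "fly", "kite"),
]

verb_object_counts = Counter((v, o) for _, v, o in NORMAL_TRIPLES)

def is_novel(triple, min_support=1):
    """Flag a description as novel if its (verb, object) pair was never seen
    (with enough support) in the normal data."""
    _, verb, obj = triple
    return verb_object_counts[(verb, obj)] < min_support

print(is_novel(("man", "walk", "dog")))      # False: a normal scene
print(is_novel(("man", "walk", "chicken")))  # True: a novel scene
```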

2020

Entity-Aware Dependency-Based Deep Graph Attention Network for Comparative Preference Classification
Nianzu Ma | Sahisnu Mazumder | Hao Wang | Bing Liu
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

This paper studies the task of comparative preference classification (CPC). Given two entities in a sentence, our goal is to classify whether the first (or the second) entity is preferred over the other or no comparison is expressed at all between the two entities. Existing works either do not learn entity-aware representations well and fail to deal with sentences involving multiple entity pairs, or use sequential modeling approaches that are unable to capture long-range dependencies between the entities. Some also use traditional machine learning approaches that do not generalize well. This paper proposes a novel Entity-aware Dependency-based Deep Graph Attention Network (ED-GAT) that employs multi-hop graph attention over a dependency-graph sentence representation to leverage both the semantic information from word embeddings and the syntactic information from the dependency graph to solve the problem. Empirical evaluation shows that the proposed model achieves state-of-the-art performance on comparative preference classification.
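
The following is a minimal numpy sketch of multi-hop attention over a dependency graph, meant only to illustrate the general mechanism; it is not the ED-GAT architecture, and the dimensions, random weights, and toy dependency edges are assumptions.

```python
# Minimal sketch of multi-hop graph attention over a dependency graph.
# NOT the ED-GAT architecture; dimensions, weights, and edges are toy values.
import numpy as np

def graph_attention_hop(H, A, W, a):
    """One attention hop: each token attends only to its dependency neighbors
    (and itself). H: (n, d) features; A: (n, n) adjacency; W: (d, d); a: (2d,)."""
    Z = H @ W                                          # project token features
    n = Z.shape[0]
    pairs = np.concatenate(                            # pairs[i, j] = [z_i ; z_j]
        [np.repeat(Z, n, axis=0), np.tile(Z, (n, 1))], axis=1).reshape(n, n, -1)
    e = pairs @ a
    e = np.maximum(0.2 * e, e)                         # LeakyReLU on raw scores
    e = np.where(A > 0, e, -1e9)                       # mask out non-neighbors
    e = e - e.max(axis=1, keepdims=True)               # numerically stable softmax
    alpha = np.exp(e) / np.exp(e).sum(axis=1, keepdims=True)
    return np.tanh(alpha @ Z)                          # aggregate neighbor features

rng = np.random.default_rng(0)
n, d, hops = 5, 8, 3                                   # 5 tokens, 3 attention hops
H = rng.normal(size=(n, d))                            # toy word embeddings
A = np.eye(n)                                          # self-loops
A[0, 1] = A[1, 0] = A[1, 3] = A[3, 1] = 1              # toy dependency edges
for _ in range(hops):                                  # stacking hops lets distant
    W = rng.normal(size=(d, d)) / np.sqrt(d)           # entities exchange information
    a = rng.normal(size=(2 * d,))
    H = graph_attention_hop(H, A, W, a)
print(H.shape)                                         # (5, 8) contextualized tokens
```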

2019

Lifelong and Interactive Learning of Factual Knowledge in Dialogues
Sahisnu Mazumder | Bing Liu | Shuai Wang | Nianzu Ma
Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue

Dialogue systems are increasingly using knowledge bases (KBs) storing real-world facts to help generate quality responses. However, the KBs are inherently incomplete and remain fixed during conversation, which limits dialogue systems’ ability to answer questions and to handle questions involving entities or relations that are not in the KB. In this paper, we propose an engine for Continuous and Interactive Learning of Knowledge (CILK) that gives dialogue systems the ability to continuously and interactively learn and infer new knowledge during conversations. With more knowledge accumulated over time, they will be able to learn better and answer more questions. Our empirical evaluation shows that CILK is promising.
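
As a toy sketch of the interactive-learning idea (not the CILK engine itself): when the agent cannot answer from its KB, it asks the user during the conversation and stores the reply as a new fact. The KB contents, prompt wording, and simulated user below are invented.

```python
# Toy sketch of interactive knowledge acquisition in dialogue. NOT the CILK
# engine from the paper; facts, prompts, and the simulated user are invented.

class InteractiveKBAgent:
    def __init__(self, facts):
        self.kb = dict(facts)  # (head, relation) -> tail; incomplete by design

    def answer(self, head, relation, ask_user):
        key = (head, relation)
        if key not in self.kb:
            # Knowledge gap: ask the user and keep the learned fact for later.
            self.kb[key] = ask_user(f"I don't know: what is the {relation} of {head}?")
        return self.kb[key]

def simulated_user(question):
    print("AGENT ASKS:", question)
    return "France"  # the (simulated) user's reply

agent = InteractiveKBAgent({("Chicago", "located_in"): "USA"})
print(agent.answer("Chicago", "located_in", simulated_user))  # answered from the KB
print(agent.answer("Paris", "located_in", simulated_user))    # gap: asks, then learns
print(("Paris", "located_in") in agent.kb)                    # True: knowledge retained
```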

2015

Lifelong Learning for Sentiment Classification
Zhiyuan Chen | Nianzu Ma | Bing Liu
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)