Nianzu Ma


2021

Semantic Novelty Detection in Natural Language Descriptions
Nianzu Ma | Alexander Politowicz | Sahisnu Mazumder | Jiahua Chen | Bing Liu | Eric Robertson | Scott Grigsby
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

This paper proposes to study a fine-grained semantic novelty detection task, which can be illustrated with the following example. It is normal for a person to walk a dog in the park, but if someone says “A man is walking a chicken in the park”, that is novel. Given a set of natural language descriptions of normal scenes, we want to identify descriptions of novel scenes. We are not aware of any existing work that solves this problem. Existing novelty or anomaly detection algorithms are applicable, but because they are usually topic-based, they perform poorly on our fine-grained semantic novelty detection task. This paper proposes an effective model (called GAT-MA) to solve the problem and also contributes a new dataset. Experimental evaluation shows that GAT-MA outperforms 11 baselines by large margins.
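To make the task setup concrete, the following Python sketch scores a candidate description by its distance to the closest normal-scene description. It is not GAT-MA, only an illustrative nearest-neighbor baseline over sentence embeddings; the example sentences and the sentence-transformers model name are assumptions. As the abstract notes, this kind of largely topic-based scoring is exactly what struggles on fine-grained semantic novelty.

# Illustrative nearest-neighbor novelty baseline, NOT the paper's GAT-MA model.
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed dependency

normal_scenes = [
    "A man is walking a dog in the park.",
    "A woman is jogging along the river.",
    "Children are playing soccer on the field.",
]
candidates = [
    "A man is walking a puppy in the park.",    # normal scene
    "A man is walking a chicken in the park.",  # semantically novel scene
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
normal_emb = encoder.encode(normal_scenes, normalize_embeddings=True)
cand_emb = encoder.encode(candidates, normalize_embeddings=True)

for sentence, vec in zip(candidates, cand_emb):
    # Novelty score = cosine distance to the closest normal description.
    score = 1.0 - float(np.max(normal_emb @ vec))
    print(f"{score:.3f}  {sentence}")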

2020

Entity-Aware Dependency-Based Deep Graph Attention Network for Comparative Preference Classification
Nianzu Ma | Sahisnu Mazumder | Hao Wang | Bing Liu
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

This paper studies the task of comparative preference classification (CPC). Given two entities in a sentence, our goal is to classify whether the first (or the second) entity is preferred over the other, or whether no comparison is expressed between the two entities at all. Existing works either do not learn entity-aware representations well and fail to deal with sentences involving multiple entity pairs, or use sequential modeling approaches that cannot capture long-range dependencies between the entities. Some also use traditional machine learning approaches that do not generalize well. This paper proposes a novel Entity-aware Dependency-based Deep Graph Attention Network (ED-GAT) that employs multi-hop graph attention over a dependency-graph representation of the sentence, leveraging both the semantic information in word embeddings and the syntactic information in the dependency graph. Empirical evaluation shows that the proposed model achieves state-of-the-art performance on comparative preference classification.
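A minimal sketch of the core idea, graph attention restricted to dependency arcs, is given below. It assumes spaCy and PyTorch, is not the authors' ED-GAT implementation, and omits entity-aware inputs and the classification head; stacking several such layers yields the multi-hop attention the abstract describes.

# Minimal single-head graph attention over a dependency parse (sketch, not ED-GAT).
import spacy
import torch
import torch.nn as nn
import torch.nn.functional as F

class DepGraphAttentionLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (n, in_dim) token features; adj: (n, n) dependency adjacency with self-loops.
        z = self.W(h)
        n = z.size(0)
        pair = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                          z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        logits = F.leaky_relu(self.a(pair).squeeze(-1))
        logits = logits.masked_fill(adj == 0, float("-inf"))  # attend only along arcs
        alpha = torch.softmax(logits, dim=-1)
        return F.elu(alpha @ z)

nlp = spacy.load("en_core_web_sm")
doc = nlp("The phone camera is better than the tablet camera.")
n = len(doc)
adj = torch.eye(n)                              # self-loops
for tok in doc:
    if tok.i != tok.head.i:                     # undirected dependency arcs
        adj[tok.i, tok.head.i] = adj[tok.head.i, tok.i] = 1.0

h = torch.randn(n, 64)                          # stand-in for pretrained word embeddings
layer = DepGraphAttentionLayer(64, 64)
out = layer(h, adj)                             # one hop; stack layers for multi-hop
print(out.shape)                                # (n, 64)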

2019

Lifelong and Interactive Learning of Factual Knowledge in Dialogues
Sahisnu Mazumder | Bing Liu | Shuai Wang | Nianzu Ma
Proceedings of the 20th Annual SIGdial Meeting on Discourse and Dialogue

Dialogue systems increasingly use knowledge bases (KBs) of real-world facts to help generate quality responses. However, because KBs are inherently incomplete and remain fixed during a conversation, dialogue systems are limited in their ability to answer questions, especially questions involving entities or relations that are not in the KB. In this paper, we propose an engine for Continuous and Interactive Learning of Knowledge (CILK) that gives dialogue systems the ability to continuously and interactively learn and infer new knowledge during conversations. With more knowledge accumulated over time, they can answer more questions and learn better. Our empirical evaluation shows that CILK is promising.
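The toy Python sketch below illustrates only the setting, a KB that grows as the conversation supplies facts it could not answer; the names are hypothetical, and the actual CILK engine learns to infer new facts from user-supplied supporting knowledge rather than simply storing replies.

# Toy illustration of interactively growing a KB during dialogue (not the CILK engine).
class TripleStore:
    def __init__(self):
        self.facts = set()                       # (head, relation, tail) triples

    def add(self, head, rel, tail):
        self.facts.add((head, rel, tail))

    def query(self, head, rel):
        answers = {t for (h, r, t) in self.facts if h == head and r == rel}
        return answers or None

kb = TripleStore()
kb.add("Chicago", "located_in", "Illinois")      # initial, inherently incomplete KB

question = ("Boston", "located_in")              # user asks where Boston is located
if kb.query(*question) is None:
    # The KB cannot answer, so the system asks the user and learns from the reply.
    user_reply_tail = "Massachusetts"            # simulated user turn
    kb.add(question[0], question[1], user_reply_tail)

print(kb.query("Boston", "located_in"))          # {'Massachusetts'}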

2015

Lifelong Learning for Sentiment Classification
Zhiyuan Chen | Nianzu Ma | Bing Liu
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)