2022
A Meta-framework for Spatiotemporal Quantity Extraction from Text
Qiang Ning | Ben Zhou | Hao Wu | Haoruo Peng | Chuchu Fan | Matt Gardner
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
News events are often associated with quantities (e.g., the number of COVID-19 patients or the number of arrests in a protest), and it is often important to extract their type, time, and location from unstructured text in order to analyze these quantity events. This paper thus formulates the NLP problem of spatiotemporal quantity extraction, and proposes the first meta-framework for solving it. This meta-framework contains a formalism that decomposes the problem into several information extraction tasks, a shareable crowdsourcing pipeline, and transformer-based baseline models. We demonstrate the meta-framework in three domains—the COVID-19 pandemic, Black Lives Matter protests, and 2020 California wildfires—to show that the formalism is general and extensible, the crowdsourcing pipeline facilitates fast and high-quality data annotation, and the baseline system can handle spatiotemporal quantity extraction well enough to be practically useful. We release all resources for future research on this topic at https://github.com/steqe.
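A minimal sketch of what the decomposition described above might yield per extracted quantity: a record with a type, a numeric value, a time, and a location grounded in a text span. The field names and the example are assumptions for illustration, not the schema released at https://github.com/steqe.

```python
# Illustrative only: a minimal record type for a spatiotemporally grounded
# quantity, mirroring the decomposition into type, value, time, and location.
from dataclasses import dataclass
from typing import Optional

@dataclass
class QuantityEvent:
    quantity_type: str          # e.g., "COVID-19 cases" or "arrests"
    value: float                # the extracted numeric quantity
    time: Optional[str]         # normalized time, e.g., "2020-06-01"
    location: Optional[str]     # normalized place, e.g., "Los Angeles, CA"
    source_span: str            # the sentence the quantity was extracted from

# Example: one record extracted from a hypothetical news sentence.
event = QuantityEvent(
    quantity_type="arrests",
    value=200,
    time="2020-06-01",
    location="Los Angeles, CA",
    source_span="Police made about 200 arrests in Los Angeles on June 1.",
)
```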
2019
KnowSemLM: A Knowledge Infused Semantic Language Model
Haoruo Peng | Qiang Ning | Dan Roth
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Story understanding requires developing expectations of what events come next in text. Prior knowledge – both statistical and declarative – is essential in guiding such expectations. While existing semantic language models (SemLM) capture event co-occurrence information by modeling event sequences as semantic frames, entities, and other semantic units, this paper aims at augmenting them with causal knowledge (i.e., one event is likely to lead to another). Such knowledge is modeled at the frame and entity level, and can be obtained either statistically from text or stated declaratively. The proposed method, KnowSemLM, infuses this knowledge into a semantic LM by joint training and inference, and is shown to be effective on both the event cloze test and story/referent prediction tasks.
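A minimal sketch of the general idea of combining event co-occurrence scores from a semantic LM with causal knowledge scores. KnowSemLM infuses the knowledge via joint training and inference; the simple log-linear interpolation below is only an illustration under that assumption, and all tables are toy data.

```python
import math

semlm_prob = {                      # P(next_event | prev_event) from a SemLM
    ("earthquake.hit", "building.collapse"): 0.20,
    ("earthquake.hit", "people.celebrate"): 0.05,
}
causal_score = {                    # causal strength (statistical or declarative)
    ("earthquake.hit", "building.collapse"): 0.9,
    ("earthquake.hit", "people.celebrate"): 0.1,
}

def knowledge_infused_score(prev, nxt, lam=0.5):
    """Log-linear mix of SemLM probability and causal knowledge."""
    return (1 - lam) * math.log(semlm_prob[(prev, nxt)]) \
         + lam * math.log(causal_score[(prev, nxt)])

candidates = ["building.collapse", "people.celebrate"]
best = max(candidates, key=lambda e: knowledge_infused_score("earthquake.hit", e))
print(best)  # building.collapse
```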
2018
CogCompTime: A Tool for Understanding Time in Natural Language
Qiang Ning | Ben Zhou | Zhili Feng | Haoruo Peng | Dan Roth
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Automatic extraction of temporal information is important for natural language understanding. It involves two basic tasks: (1) Understanding time expressions that are mentioned explicitly in text (e.g., February 27, 1998 or tomorrow), and (2) Understanding temporal information that is conveyed implicitly via relations. This paper introduces CogCompTime, a system that has these two important functionalities. It incorporates the most recent progress, achieves state-of-the-art performance, and is publicly available at http://cogcomp.org/page/publication_view/844.
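To make the two subtasks concrete, here are hypothetical function signatures for them. These are NOT CogCompTime's actual API, only a sketch of the kind of interface such a system exposes, with a toy rule-based timex normalizer and a stubbed relation classifier.

```python
import datetime

def normalize_timex(mention: str, document_time: datetime.date) -> datetime.date:
    """Task 1: map an explicit time expression to a calendar date."""
    if mention.lower() == "tomorrow":
        return document_time + datetime.timedelta(days=1)
    return datetime.datetime.strptime(mention, "%B %d, %Y").date()

def classify_relation(event_a: str, event_b: str) -> str:
    """Task 2: label the implicit temporal relation between two events.
    A real system predicts this with a trained classifier; here we stub it."""
    return "before"

print(normalize_timex("February 27, 1998", datetime.date(1998, 2, 26)))  # 1998-02-27
print(normalize_timex("tomorrow", datetime.date(1998, 2, 26)))           # 1998-02-27
```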
Improving Temporal Relation Extraction with a Globally Acquired Statistical Resource
Qiang Ning | Hao Wu | Haoruo Peng | Dan Roth
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Extracting temporal relations (before, after, overlapping, etc.) is a key aspect of understanding events described in natural language. We argue that this task would gain from the availability of a resource that provides prior knowledge in the form of the temporal order that events usually follow. This paper develops such a resource – a probabilistic knowledge base acquired in the news domain – by extracting temporal relations between events from the New York Times (NYT) articles over a 20-year span (1987–2007). We show that existing temporal extraction systems can be improved via this resource. As a byproduct, we also show that interesting statistics can be retrieved from this resource, which can potentially benefit other time-aware tasks. The proposed system and resource are both publicly available.
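A minimal sketch of how a temporal-order prior of this kind can be used: counts of (event_a, event_b, relation) observed in a large corpus are turned into P(relation | event_a, event_b) and consulted as prior knowledge by a pairwise temporal classifier. The counts below are toy numbers, not the released NYT-derived resource.

```python
from collections import Counter

# Toy corpus statistics: how often "arrest" is observed before/after "charge".
counts = Counter({
    ("arrest", "charge", "before"): 940,
    ("arrest", "charge", "after"):   60,
})

def prior(event_a, event_b, relation):
    """P(relation | event_a, event_b) estimated from corpus counts."""
    total = sum(c for (a, b, _), c in counts.items() if (a, b) == (event_a, event_b))
    return counts[(event_a, event_b, relation)] / total if total else 0.0

print(prior("arrest", "charge", "before"))  # 0.94
```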
2017
A Joint Model for Semantic Sequences: Frames, Entities, Sentiments
Haoruo Peng | Snigdha Chaturvedi | Dan Roth
Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
Understanding stories – sequences of events – is a crucial yet challenging natural language understanding task. These events typically carry multiple aspects of semantics including actions, entities and emotions. Not only does each individual aspect contribute to the meaning of the story, but so does the interaction among these aspects. Building on this intuition, we propose to jointly model important aspects of semantic knowledge – frames, entities and sentiments – via a semantic language model. We achieve this by first representing these aspects’ semantic units at an appropriate level of abstraction and then using the resulting vector representations for each semantic aspect to learn a joint representation via a neural language model. We show that the joint semantic language model is of high quality and can generate better semantic sequences than models that operate on the word level. We further demonstrate that our joint model can be applied to the story cloze test and shallow discourse parsing tasks with improved performance and that each semantic aspect contributes to the model.
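A sketch of the modeling idea under stated assumptions: represent each step of a story by its frame, entity, and sentiment units, embed each aspect, concatenate, and train a recurrent language model over the joint representation. Dimensions, vocabulary sizes, and the LSTM choice are placeholders, not the paper's configuration.

```python
import torch
import torch.nn as nn

class JointSemanticLM(nn.Module):
    def __init__(self, n_frames=500, n_entities=200, n_sentiments=3, dim=32, hidden=64):
        super().__init__()
        self.frame_emb = nn.Embedding(n_frames, dim)
        self.entity_emb = nn.Embedding(n_entities, dim)
        self.sent_emb = nn.Embedding(n_sentiments, dim)
        self.rnn = nn.LSTM(3 * dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_frames)   # predict the next frame

    def forward(self, frames, entities, sentiments):
        # Concatenate the three aspect embeddings at every story step.
        x = torch.cat([self.frame_emb(frames),
                       self.entity_emb(entities),
                       self.sent_emb(sentiments)], dim=-1)
        h, _ = self.rnn(x)
        return self.out(h)                       # logits over next-frame vocabulary

# One toy batch: a story of 4 events.
model = JointSemanticLM()
logits = model(torch.randint(0, 500, (1, 4)),
               torch.randint(0, 200, (1, 4)),
               torch.randint(0, 3, (1, 4)))
print(logits.shape)  # torch.Size([1, 4, 500])
```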
Story Comprehension for Predicting What Happens Next
Snigdha Chaturvedi | Haoruo Peng | Dan Roth
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Automatic story comprehension is a fundamental challenge in Natural Language Understanding, and can enable computers to learn about social norms, human behavior and commonsense. In this paper, we present a story comprehension model that explores three distinct semantic aspects: (i) the sequence of events described in the story, (ii) its emotional trajectory, and (iii) its plot consistency. We judge the model’s understanding of real-world stories by inquiring if, like humans, it can develop an expectation of what will happen next in a given story. Specifically, we use it to predict the correct ending of a given short story from possible alternatives. The model uses a hidden variable to weigh the semantic aspects in the context of the story. Our experiments demonstrate the potential of our approach to characterize these semantic aspects, and the strength of the hidden variable based approach. The model outperforms the state-of-the-art approaches and achieves best results on a publicly available dataset.
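A schematic rendering of the scoring scheme described above: each candidate ending gets a score per semantic aspect (event sequence, emotional trajectory, plot consistency), and a story-dependent hidden variable weighs the aspects. The numbers and the softmax weighting below are illustrative only, not the paper's parameterization.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    return [e / sum(exps) for e in exps]

# Per-aspect compatibility scores for two candidate endings (toy values).
aspect_scores = {
    "ending_1": {"events": 0.8, "emotion": 0.3, "plot": 0.7},
    "ending_2": {"events": 0.4, "emotion": 0.9, "plot": 0.2},
}
# Hidden-variable weights inferred from the story context (toy values).
weights = dict(zip(["events", "emotion", "plot"], softmax([2.0, 0.5, 1.0])))

def score(ending):
    """Aspect scores mixed according to the hidden-variable weights."""
    return sum(weights[a] * s for a, s in aspect_scores[ending].items())

print(max(aspect_scores, key=score))  # ending_1
```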
Maximum Margin Reward Networks for Learning from Explicit and Implicit Supervision
Haoruo Peng | Ming-Wei Chang | Wen-tau Yih
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Neural networks have achieved state-of-the-art performance on several structured-output prediction tasks, trained in a fully supervised fashion. However, annotated examples in structured domains are often costly to obtain, which thus limits the applications of neural networks. In this work, we propose Maximum Margin Reward Networks, a neural network-based framework that aims to learn from both explicit (full structures) and implicit supervision signals (delayed feedback on the correctness of the predicted structure). On named entity recognition and semantic parsing, our model outperforms previous systems on the benchmark datasets, CoNLL-2003 and WebQuestionsSP.
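A toy illustration of the max-margin flavor of training described above: the model's score for the gold structure must exceed the score of a predicted structure by a margin tied to how wrong the prediction is. Under explicit supervision that cost comes from the full gold structure; under implicit supervision only delayed correctness feedback is available, so the cost comes from that feedback. This is not the paper's model, just a sketch of the loss shape.

```python
def margin_reward_loss(score_gold, score_pred, cost):
    """Structured hinge loss: max(0, cost - (score_gold - score_pred))."""
    return max(0.0, cost - (score_gold - score_pred))

# Explicit supervision: cost measured against the full gold structure.
print(margin_reward_loss(2.0, 1.5, cost=1.0))   # 0.5

# Implicit supervision: only delayed feedback on the predicted structure is
# available, so a fixed cost penalizes the highest-scoring wrong prediction.
print(margin_reward_loss(2.0, 2.25, cost=1.0))  # 1.25
```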
2016
Two Discourse Driven Language Models for Semantics
Haoruo Peng | Dan Roth
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Event Detection and Co-reference with Minimal Supervision
Haoruo Peng | Yangqiu Song | Dan Roth
Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing
2015
Solving Hard Coreference Problems
Haoruo Peng | Daniel Khashabi | Dan Roth
Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
The University of Illinois submission to the WMT 2015 Shared Translation Task
Lane Schwartz | Bill Bryce | Chase Geigle | Sean Massung | Yisi Liu | Haoruo Peng | Vignesh Raja | Subhro Roy | Shyam Upadhyay
Proceedings of the Tenth Workshop on Statistical Machine Translation
A Joint Framework for Coreference Resolution and Mention Head Detection
Haoruo Peng | Kai-Wei Chang | Dan Roth
Proceedings of the Nineteenth Conference on Computational Natural Language Learning
Improving a Pipeline Architecture for Shallow Discourse Parsing
Yangqiu Song | Haoruo Peng | Parisa Kordjamshidi | Mark Sammons | Dan Roth
Proceedings of the Nineteenth Conference on Computational Natural Language Learning - Shared Task