Guy Emerson


2024

Colour Me Uncertain: Representing Vagueness with Probabilistic Semantics
Kin Chun Cheung | Guy Emerson
Proceedings of the Third Workshop on Understanding Implicit and Underspecified Language

People successfully communicate in everyday situations using vague language. In particular, colour terms have no clear boundaries as to the ranges of colours they describe. We model people’s reasoning process in a dyadic reference game using the Rational Speech Acts (RSA) framework and probabilistic semantics, and we find that the implementation of probabilistic semantics requires a modification from pure theory to perform well on real-world data. In addition, we explore approaches to handling target disagreements in reference games, an issue that is rarely discussed in the RSA literature.
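
To make the modelling setup concrete, here is a minimal sketch of the standard RSA recursion with graded (probabilistic) truth values in place of Boolean semantics. The colour terms, the numbers, and the uniform priors are illustrative assumptions; the paper's actual colour semantics and its proposed modification are not reproduced here.

```python
import numpy as np

# Minimal RSA sketch: 2 colour chips (referents), 2 colour terms (utterances).
# Rows: utterances; columns: referents. With probabilistic semantics the
# "truth value" is a graded probability in [0, 1], not a Boolean.
# These numbers are invented for illustration, not taken from the paper.
semantics = np.array([[0.9, 0.4],   # P("blue" is true of chip0 / chip1)
                      [0.2, 0.8]])  # P("green" is true of chip0 / chip1)

alpha = 1.0  # speaker rationality parameter

def literal_listener(sem):
    # L0(m | u) ∝ P(u true of m) * P(m), with a uniform prior over referents
    return sem / sem.sum(axis=1, keepdims=True)

def pragmatic_speaker(sem):
    # S1(u | m) ∝ exp(alpha * log L0(m | u))
    util = np.exp(alpha * np.log(literal_listener(sem)))
    return util / util.sum(axis=0, keepdims=True)

def pragmatic_listener(sem):
    # L1(m | u) ∝ S1(u | m) * P(m)
    s1 = pragmatic_speaker(sem)
    return s1 / s1.sum(axis=1, keepdims=True)

print(pragmatic_listener(semantics))
```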

UG-Schematic Annotation for Event Nominals: A Case Study in Mandarin Chinese
Wenxi Li | Yutong Zhang | Guy Emerson | Weiwei Sun
Computational Linguistics, Volume 50, Issue 2 - June 2024

Divergence of languages observed at the surface level is a major challenge encountered by multilingual data representation, especially when typologically distant languages are involved. Drawing inspiration from a formalist Chomskyan perspective towards language universals, Universal Grammar (UG), this article uses deductively pre-defined universals to analyze a multilingually heterogeneous phenomenon, event nominals. In this way, deeper universality of event nominals beneath their huge divergence in different languages is uncovered, which empowers us to break barriers between languages and thus extend insights from some synthetic languages to a non-inflectional language, Mandarin Chinese. Our empirical investigation also demonstrates that this UG-inspired schema is effective: with its assistance, the inter-annotator agreement (IAA) for identifying event nominals in Mandarin grows from 88.02% to 94.99%, and automatic detection of event-reading nominalizations on the newly established data achieves an accuracy of 94.76% and an F1 score of 91.3%, which significantly surpass those achieved on the pre-existing resource by 9.8% and 5.2%, respectively. Our systematic analysis also sheds light on nominal semantic role labeling. By providing a clear definition and classification of the arguments of event nominals, the IAA of this task significantly increases from 90.46% to 98.04%.

2023

Functional Distributional Semantics at Scale
Chun Hei Lo | Hong Cheng | Wai Lam | Guy Emerson
Proceedings of the 12th Joint Conference on Lexical and Computational Semantics (*SEM 2023)

Functional Distributional Semantics is a linguistically motivated framework for modelling lexical and sentence-level semantics with truth-conditional functions using distributional information. Previous implementations of the framework focus on subject-verb-object (SVO) triples only, which largely limits the contextual information available for training and thus the capability of the learnt model. In this paper, we discuss the challenges of extending the previous architectures to training on arbitrary sentences. We address the challenges by proposing a more expressive lexical model that works over a continuous semantic space. This improves the flexibility and computational efficiency of the model, as well as its compatibility with present-day machine-learning frameworks. Our proposal allows the model to be applied to a wider range of semantic tasks, and experimental results demonstrate improved performance.
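
As a rough illustration of the word-as-classifier idea underlying the framework, the following sketch treats each word as a linear binary classifier over a continuous semantic space, so that a word assigns a probability of truth to any point in that space. The architecture, names, and dimensions are assumptions for illustration, not the paper's model.

```python
import torch
import torch.nn as nn

# Sketch of a truth-conditional lexical model: each word is a binary
# classifier over a continuous semantic space, so P(word is true of an
# entity) is a function of that entity's latent representation.
# Dimensions and architecture are illustrative placeholders.
SEMANTIC_DIM = 300
VOCAB_SIZE = 10000

class TruthConditionalLexicon(nn.Module):
    def __init__(self):
        super().__init__()
        # One linear classifier (weight vector + bias) per word.
        self.weights = nn.Embedding(VOCAB_SIZE, SEMANTIC_DIM)
        self.biases = nn.Embedding(VOCAB_SIZE, 1)

    def forward(self, word_ids, entity_vecs):
        # entity_vecs: (batch, SEMANTIC_DIM) points in the semantic space
        w = self.weights(word_ids)             # (batch, SEMANTIC_DIM)
        b = self.biases(word_ids).squeeze(-1)  # (batch,)
        logits = (w * entity_vecs).sum(-1) + b
        return torch.sigmoid(logits)           # P(word true of entity)

lexicon = TruthConditionalLexicon()
probs = lexicon(torch.tensor([42]), torch.randn(1, SEMANTIC_DIM))
```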

Are Embedded Potatoes Still Vegetables? On the Limitations of WordNet Embeddings for Lexical Semantics
Xuyou Cheng | Michael Schlichtkrull | Guy Emerson
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing

Knowledge Base Embedding (KBE) models have been widely used to encode structured information from knowledge bases, including WordNet. However, the existing literature has predominantly focused on link prediction as the evaluation task, often neglecting exploration of the models’ semantic capabilities. In this paper, we investigate the potential disconnect between the performance of KBE models of WordNet on link prediction and their ability to encode semantic information, highlighting the limitations of current evaluation protocols. Our findings reveal that some top-performing KBE models on the WN18RR benchmark exhibit subpar results on two semantic tasks and two downstream tasks. These results demonstrate the inadequacy of link prediction benchmarks for evaluating the semantic capabilities of KBE models, suggesting the need for a more targeted assessment approach.
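
For readers unfamiliar with the evaluation protocol being criticised, here is a sketch of link prediction with a TransE-style scorer standing in for the KBE models studied; the embeddings are random and all names and sizes are illustrative assumptions.

```python
import numpy as np

# Sketch of the link-prediction protocol: given a (head, relation, ?)
# query, rank every entity as a candidate tail. TransE scores a triple
# as plausible if head + relation ≈ tail. Illustrative only.
rng = np.random.default_rng(0)
NUM_ENTITIES, NUM_RELATIONS, DIM = 1000, 11, 50
E = rng.normal(size=(NUM_ENTITIES, DIM))   # entity embeddings
R = rng.normal(size=(NUM_RELATIONS, DIM))  # relation embeddings

def score(h, r, t):
    # Higher (less negative) = more plausible triple.
    return -np.linalg.norm(E[h] + R[r] - E[t])

def rank_tail(h, r, true_t):
    # Rank the gold tail among all entities (rank 1 = best).
    scores = -np.linalg.norm(E[h] + R[r] - E, axis=1)
    return int((scores > scores[true_t]).sum()) + 1

print(rank_tail(h=0, r=3, true_t=7))
```

A model can do well at this ranking task without its entity embeddings encoding the semantic information probed by the paper's other tasks, which is the disconnect the paper investigates.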

Visual Spatial Reasoning
Fangyu Liu | Guy Emerson | Nigel Collier
Transactions of the Association for Computational Linguistics, Volume 11

Spatial relations are a basic part of human cognition. However, they are expressed in natural language in a variety of ways, and previous work has suggested that current vision-and-language models (VLMs) struggle to capture relational information. In this paper, we present Visual Spatial Reasoning (VSR), a dataset containing more than 10k natural text-image pairs with 66 types of spatial relations in English (e.g., under, in front of, facing). While using a seemingly simple annotation format, we show how the dataset includes challenging linguistic phenomena, such as varying reference frames. We demonstrate a large gap between human and model performance: the human ceiling is above 95%, while state-of-the-art models only achieve around 70%. We observe that VLMs’ per-relation performance has little correlation with the number of training examples, and that the tested models are in general incapable of recognising relations concerning the orientations of objects.

2022

Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)
Guy Emerson | Natalie Schluter | Gabriel Stanovsky | Ritesh Kumar | Alexis Palmer | Nathan Schneider | Siddharth Singh | Shyam Ratan
Proceedings of the 16th International Workshop on Semantic Evaluation (SemEval-2022)

Learning Functional Distributional Semantics with Visual Data
Yinhong Liu | Guy Emerson
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Functional Distributional Semantics is a recently proposed framework for learning distributional semantics that provides linguistic interpretability. It models the meaning of a word as a binary classifier rather than a numerical vector. In this work, we propose a method to train a Functional Distributional Semantics model with grounded visual data. We train it on the Visual Genome dataset, which is closer to the kind of data encountered in human language acquisition than a large text corpus. On four external evaluation datasets, our model outperforms previous work on learning semantics from Visual Genome.
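
A minimal sketch of the word-as-classifier idea applied to grounded input: a word's meaning is a binary classifier over visual feature vectors, e.g. features extracted from image regions. The feature dimension, names, and training signal are illustrative assumptions, not the paper's model.

```python
import torch
import torch.nn as nn

# Sketch of grounding a word-as-classifier: instead of comparing word
# vectors, apply the word's classifier to visual features for image
# regions (e.g. object crops from a Visual Genome scene graph) and read
# off graded truth judgements. All sizes are illustrative.
class WordClassifier(nn.Module):
    def __init__(self, feat_dim=512):
        super().__init__()
        self.linear = nn.Linear(feat_dim, 1)

    def forward(self, region_feats):
        # P(the word truthfully applies to the entity in each region)
        return torch.sigmoid(self.linear(region_feats)).squeeze(-1)

cat = WordClassifier()
regions = torch.randn(4, 512)  # features for 4 image regions
truth_probs = cat(regions)     # graded truth values, not a single vector
```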

Using dependency parsing for few-shot learning in distributional semantics
Stefania Preda | Guy Emerson
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop

In this work, we explore the novel idea of employing dependency parsing information in the context of few-shot learning, the task of learning the meaning of a rare word based on a limited amount of context sentences. Firstly, we use dependency-based word embedding models as background spaces for few-shot learning. Secondly, we introduce two few-shot learning methods which enhance the additive baseline model by using dependencies.
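
For context, the additive baseline mentioned above can be sketched as follows: the rare word's vector is the average of the background-space vectors of its context words. The dependency-based methods in the paper restrict or reweight which context words contribute (e.g. favouring the rare word's syntactic head and dependents); that part is only gestured at in the comments. The function and data below are illustrative.

```python
import numpy as np

# Additive baseline for few-shot word learning: infer a vector for a
# rare word by averaging background-space vectors of its context words.
# A dependency-aware variant would filter or weight the context words
# by their syntactic relation to the rare word.
def additive_embedding(context_sentences, background_space, stopwords=()):
    vecs = [background_space[w]
            for sent in context_sentences for w in sent
            if w in background_space and w not in stopwords]
    return np.mean(vecs, axis=0)

space = {"dog": np.array([1.0, 0.0]), "barks": np.array([0.5, 0.5])}
print(additive_embedding([["dog", "barks"]], space))
```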

2021

Incremental Beam Manipulation for Natural Language Generation
James Hargreaves | Andreas Vlachos | Guy Emerson
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume

The performance of natural language generation systems has improved substantially with modern neural networks. At test time they typically employ beam search to avoid locally optimal but globally suboptimal predictions. However, due to model errors, a larger beam size can lead to deteriorating performance according to the evaluation metric. For this reason, it is common to rerank the output of beam search, but this relies on beam search to produce a good set of hypotheses, which limits the potential gains. Other alternatives to beam search require changes to the training of the model, which restricts their applicability compared to beam search. This paper proposes incremental beam manipulation, i.e. reranking the hypotheses in the beam during decoding instead of only at the end. This way, hypotheses that are unlikely to lead to a good final output are discarded, and in their place hypotheses that would have been ignored will be considered instead. Applying incremental beam manipulation leads to an improvement of 1.93 and 5.82 BLEU points over vanilla beam search for the test sets of the E2E and WebNLG challenges respectively. The proposed method also outperformed a strong reranker by 1.04 BLEU points on the E2E challenge, while being on par with it on the WebNLG dataset.
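
The core loop can be sketched as follows, with the generation model and the reranker left as placeholder callables; this is an illustration of the idea under assumed interfaces, not the paper's implementation.

```python
import math

def beam_search_with_manipulation(expand, rerank, beam_size, max_len):
    """expand(prefix) -> list of (token, log_prob); rerank(prefix) -> score."""
    beams = [([], 0.0)]  # (token prefix, cumulative log-probability)
    for _ in range(max_len):
        candidates = [(prefix + [tok], logp + tok_logp)
                      for prefix, logp in beams
                      for tok, tok_logp in expand(prefix)]
        # Vanilla beam search keeps the top-k by log-probability; here an
        # external reranker reorders the partial hypotheses at each step,
        # pruning prefixes unlikely to lead to a good final output.
        candidates.sort(key=lambda c: rerank(c[0]), reverse=True)
        beams = candidates[:beam_size]
    return beams[0][0]

# Toy model: the LM prefers "b", but the reranker prefers more "a"s.
expand = lambda prefix: [("a", math.log(0.4)), ("b", math.log(0.6))]
print(beam_search_with_manipulation(expand, lambda h: h.count("a"), 2, 3))
```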

Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)
Alexis Palmer | Nathan Schneider | Natalie Schluter | Guy Emerson | Aurelie Herbelot | Xiaodan Zhu
Proceedings of the 15th International Workshop on Semantic Evaluation (SemEval-2021)

2020

Leveraging Sentence Similarity in Natural Language Generation: Improving Beam Search using Range Voting
Sebastian Borgeaud | Guy Emerson
Proceedings of the Fourth Workshop on Neural Generation and Translation

We propose a method for natural language generation, choosing the most representative output rather than the most likely output. By viewing the language generation process from the voting theory perspective, we define representativeness using range voting and a similarity measure. The proposed method can be applied when generating from any probabilistic language model, including n-gram models and neural network models. We evaluate different similarity measures on an image captioning task and a machine translation task, and show that our method generates longer and more diverse sentences, providing a solution to the common problem of short outputs being preferred over longer and more informative ones. The generated sentences obtain higher BLEU scores, particularly when the beam size is large. We also perform a human evaluation on both tasks and find that the outputs generated using our method are rated higher.
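
A minimal sketch of the selection step: each hypothesis votes for every candidate in proportion to its probability, with ballots given by a similarity measure, and the candidate with the highest total score is chosen. The unigram-overlap similarity below is a placeholder, not one of the measures evaluated in the paper.

```python
import math
from collections import Counter

# Range voting over generated hypotheses: the most representative
# output maximises probability-weighted similarity to all hypotheses.
def similarity(a, b):
    # Placeholder similarity: unigram overlap, normalised by length.
    ca, cb = Counter(a.split()), Counter(b.split())
    return sum((ca & cb).values()) / max(len(a.split()), len(b.split()))

def most_representative(hypotheses):
    # hypotheses: list of (sentence, log_probability) from any LM,
    # e.g. the final beam of a beam search.
    def total_score(cand):
        return sum(math.exp(logp) * similarity(cand, voter)
                   for voter, logp in hypotheses)
    return max((h for h, _ in hypotheses), key=total_score)

hyps = [("a cat sits", -0.9), ("a cat sits down", -1.2), ("dogs run", -2.5)]
print(most_representative(hyps))
```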

Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics
Guy Emerson
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

Functional Distributional Semantics provides a linguistically interpretable framework for distributional semantics, by representing the meaning of a word as a function (a binary classifier), instead of a vector. However, the large number of latent variables means that inference is computationally expensive, and training a model is therefore slow to converge. In this paper, I introduce the Pixie Autoencoder, which augments the generative model of Functional Distributional Semantics with a graph-convolutional neural network to perform amortised variational inference. This allows the model to be trained more effectively, achieving better results on two tasks (semantic similarity in context and semantic composition), and outperforming BERT, a large pre-trained language model.
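
As a rough sketch of the amortisation step, an encoder with a graph-convolution layer can map an observed graph directly to the parameters of an approximate posterior over the latent variables, avoiding per-example iterative inference. The sizes, architecture, and names below are illustrative assumptions, not the Pixie Autoencoder itself.

```python
import torch
import torch.nn as nn

# Amortised variational inference with a graph convolution: the encoder
# maps node features and graph structure to a mean and log-variance per
# node, defining a Gaussian approximate posterior over the latents.
class GraphEncoder(nn.Module):
    def __init__(self, in_dim, hid_dim, latent_dim):
        super().__init__()
        self.gcn = nn.Linear(in_dim, hid_dim)
        self.mean_head = nn.Linear(hid_dim, latent_dim)
        self.logvar_head = nn.Linear(hid_dim, latent_dim)

    def forward(self, node_feats, adj):
        # One graph-convolution step: aggregate each node's neighbourhood
        # (adj should include self-loops and be normalised), then transform.
        h = torch.relu(self.gcn(adj @ node_feats))
        return self.mean_head(h), self.logvar_head(h)

enc = GraphEncoder(in_dim=16, hid_dim=32, latent_dim=8)
feats, adj = torch.randn(3, 16), torch.eye(3)
mu, logvar = enc(feats, adj)
# Reparameterisation: a single forward pass yields posterior samples.
z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
```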

What are the Goals of Distributional Semantics?
Guy Emerson
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

Distributional semantic models have become a mainstay in NLP, providing useful features for downstream tasks. However, assessing long-term progress requires explicit long-term goals. In this paper, I take a broad linguistic perspective, looking at how well current models can deal with various semantic challenges. Given stark differences between models proposed in different subfields, a broad perspective is needed to see how we could integrate them. I conclude that, while linguistic insights can guide the design of model architectures, future progress will require balancing the often conflicting demands of linguistic expressiveness and computational tractability.

Investigating Cross-Linguistic Adjective Ordering Tendencies with a Latent-Variable Model
Jun Yen Leung | Guy Emerson | Ryan Cotterell
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Across languages, multiple consecutive adjectives modifying a noun (e.g. “the big red dog”) follow certain unmarked ordering rules. While explanatory accounts have been put forward, much of the work done in this area has relied primarily on the intuitive judgment of native speakers, rather than on corpus data. We present the first purely corpus-driven model of multi-lingual adjective ordering in the form of a latent-variable model that can accurately order adjectives across 24 different languages, even when the training and testing languages are different. We utilize this novel statistical model to provide strong converging evidence for the existence of universal, cross-linguistic, hierarchical adjective ordering tendencies.

Linguists Who Use Probabilistic Models Love Them: Quantification in Functional Distributional Semantics
Guy Emerson
Proceedings of the Probability and Meaning Conference (PaM 2020)

Functional Distributional Semantics provides a computationally tractable framework for learning truth-conditional semantics from a corpus. Previous work in this framework has provided a probabilistic version of first-order logic, recasting quantification as Bayesian inference. In this paper, I show how the previous formulation gives trivial truth values when a precise quantifier is used with vague predicates. I propose an improved account, avoiding this problem by treating a vague predicate as a distribution over precise predicates. I connect this account to recent work in the Rational Speech Acts framework on modelling generic quantification, and I extend this to modelling donkey sentences. Finally, I explain how the generic quantifier can be both pragmatically complex and yet computationally simpler than precise quantifiers.
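
The key move, treating a vague predicate as a distribution over precise predicates, can be illustrated with a Monte Carlo sketch; the predicate, distribution, and numbers are invented for illustration and are not taken from the paper.

```python
import numpy as np

# A vague predicate as a distribution over precise predicates: "tall"
# becomes a distribution over exact height thresholds, so P(tall(x)) is
# the probability that x's height exceeds a sampled threshold. This
# yields graded truth values, avoiding the trivial 0/1 values that arise
# when a precise quantifier meets a single sharp predicate boundary.
rng = np.random.default_rng(0)
thresholds = rng.normal(loc=180.0, scale=5.0, size=10_000)  # cm

def p_tall(height_cm):
    # Marginalise over the precise versions of the predicate.
    return float((height_cm > thresholds).mean())

print(p_tall(178.0), p_tall(185.0))
```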

2019

Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs
Paula Czarnowska | Guy Emerson | Ann Copestake
Proceedings of the 13th International Conference on Computational Semantics - Long Papers

Distributional Semantic Models (DSMs) construct vector representations of word meanings based on their contexts. Typically, the contexts of a word are defined as its closest neighbours, but they can also be retrieved from its syntactic dependency relations. In this work, we propose a new dependency-based DSM. The novelty of our model lies in associating an independent meaning representation, a matrix, with each dependency label. This allows it to capture specifics of the relations between words and contexts, leading to good performance on both intrinsic and extrinsic evaluation tasks. In addition, our model has an inherent ability to represent dependency chains as products of matrices, which provides a straightforward way of handling further contexts of a word.
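
A minimal sketch of the composition mechanism: words are vectors, each dependency label is a matrix, and a dependency chain is a product of matrices applied to a word vector. The dimensions, labels, and random initialisation are illustrative, standing in for learned parameters.

```python
import numpy as np

# Words are vectors, dependencies are matrices: a context word seen
# through a chain of dependency labels is transformed by the product of
# the corresponding matrices. Parameters here are random placeholders.
DIM = 50
rng = np.random.default_rng(0)
word_vecs = {"dog": rng.normal(size=DIM), "chased": rng.normal(size=DIM)}
dep_mats = {"nsubj": rng.normal(size=(DIM, DIM)),
            "dobj": rng.normal(size=(DIM, DIM))}

def context_vector(word, chain):
    # E.g. the representation of "dog" as seen through "nsubj", or a
    # longer chain for more distant contexts.
    vec = word_vecs[word]
    for label in chain:
        vec = dep_mats[label] @ vec
    return vec

print(context_vector("dog", ["nsubj"])[:3])
```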

Bad Form: Comparing Context-Based and Form-Based Few-Shot Learning in Distributional Semantic Models
Jeroen Van Hautte | Guy Emerson | Marek Rei
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)

Word embeddings are an essential component in a wide range of natural language processing applications. However, distributional semantic models are known to struggle when only a small number of context sentences are available. Several methods have been proposed to obtain higher-quality vectors for these words, leveraging both this context information and sometimes the word forms themselves through a hybrid approach. We show that the current tasks do not suffice to evaluate models that use word-form information, as such models can easily leverage word forms in the training data that are related to word forms in the test data. We introduce 3 new tasks, allowing for a more balanced comparison between models. Furthermore, we show that hyperparameters that have largely been ignored in previous work can consistently improve the performance of both baseline and advanced models, achieving a new state of the art on 4 out of 6 tasks.

2017

Semantic Composition via Probabilistic Model Theory
Guy Emerson | Ann Copestake
Proceedings of the 12th International Conference on Computational Semantics (IWCS) - Long Papers

2016

Functional Distributional Semantics
Guy Emerson | Ann Copestake
Proceedings of the 1st Workshop on Representation Learning for NLP

Resources for building applications with Dependency Minimal Recursion Semantics
Ann Copestake | Guy Emerson | Michael Wayne Goodman | Matic Horvat | Alexander Kuhnle | Ewa Muszyńska
Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC'16)

We describe resources aimed at increasing the usability of the semantic representations utilized within the DELPH-IN (Deep Linguistic Processing with HPSG) consortium. We concentrate in particular on the Dependency Minimal Recursion Semantics (DMRS) formalism, a graph-based representation designed for compositional semantic representation with deep grammars. Our main focus is on English, and specifically English Resource Semantics (ERS) as used in the English Resource Grammar. We first give an introduction to ERS and DMRS and a brief overview of some existing resources, and then describe in detail a new repository which has been developed to simplify the use of ERS/DMRS. We explain a number of operations on DMRS graphs which our repository supports, with sketches of the algorithms, and illustrate how these operations can be exploited in application building. We believe that this work will aid researchers in exploiting the rich and effective but complex DELPH-IN resources.

2015

Leveraging a Semantically Annotated Corpus to Disambiguate Prepositional Phrase Attachment
Guy Emerson | Ann Copestake
Proceedings of the 11th International Conference on Computational Semantics

2014

SeedLing: Building and Using a Seed corpus for the Human Language Project
Guy Emerson | Liling Tan | Susanne Fertmann | Alexis Palmer | Michaela Regneri
Proceedings of the 2014 Workshop on the Use of Computational Methods in the Study of Endangered Languages

SentiMerge: Combining Sentiment Lexicons in a Bayesian Framework
Guy Emerson | Thierry Declerck
Proceedings of Workshop on Lexical and Grammatical Resources for Language Processing