Stephen Clark


2020

pdf
Representation Learning for Type-Driven Composition
Gijs Wijnholds | Mehrnoosh Sadrzadeh | Stephen Clark
Proceedings of the 24th Conference on Computational Natural Language Learning

This paper is about learning word representations using grammatical type information. We use the syntactic types of Combinatory Categorial Grammar to develop multilinear representations, i.e. maps with n arguments, for words with different functional types. The multilinear maps of words compose with each other to form sentence representations. We extend the skipgram algorithm from vectors to multilinear maps to learn these representations and instantiate it on unary and binary maps for transitive verbs. These are evaluated on verb and sentence similarity and disambiguation tasks and a subset of the SICK relatedness dataset. Our model performs better than previous type-driven models and is competitive with state-of-the-art representation learning methods such as BERT and neural sentence encoders.
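As a rough illustration of the composition step described in this abstract (not the paper's training code), a transitive verb can be modelled as an order-3 tensor, i.e. a binary multilinear map, that contracts with subject and object vectors to give a sentence vector. The sketch below uses made-up dimensions and random parameters purely to show the mechanics.

```python
import numpy as np

# Illustrative sketch only: a transitive verb as a binary (order-3) multilinear map.
# Dimensions and parameters are made up; in the paper these maps are learned with
# an extended skipgram objective rather than initialised randomly.
d = 100                                   # dimensionality of the noun and sentence spaces
rng = np.random.default_rng(0)

subject = rng.normal(size=d)              # vector for e.g. "dog"
obj = rng.normal(size=d)                  # vector for e.g. "ball"
verb = rng.normal(size=(d, d, d))         # order-3 tensor for e.g. "chases"

# Compose: contract the verb tensor with the subject and object vectors
# to obtain a vector for the sentence "dog chases ball".
sentence = np.einsum('ijk,j,k->i', verb, subject, obj)

# Sentence similarity can then be measured with cosine similarity.
def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(sentence.shape, cosine(sentence, sentence))
```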

pdf
Learning to Segment Actions from Observation and Narration
Daniel Fried | Jean-Baptiste Alayrac | Phil Blunsom | Chris Dyer | Stephen Clark | Aida Nematzadeh
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

We apply a generative segmental model of task structure, guided by narration, to action segmentation in video. We focus on unsupervised and weakly-supervised settings where no action labels are known during training. Despite its simplicity, our model performs competitively with previous work on a dataset of naturalistic instructional videos. Our model allows us to vary the sources of supervision used in training, and we find that both task structure and narrative language provide large benefits in segmentation quality.

2019

pdf
Factorising AMR generation through syntax
Kris Cao | Stephen Clark
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)

Generating from Abstract Meaning Representation (AMR) is an underspecified problem, as many syntactic decisions are not specified by the semantic graph. To explicitly account for this variation, we break down generating from AMR into two steps: first generate a syntactic structure, and then generate the surface form. We show that decomposing the generation process this way leads to state-of-the-art single model performance generating from AMR without additional unlabelled data. We also demonstrate that we can generate meaning-preserving syntactic paraphrases of the same AMR graph, as judged by humans.

pdf
Scalable Syntax-Aware Language Models Using Knowledge Distillation
Adhiguna Kuncoro | Chris Dyer | Laura Rimell | Stephen Clark | Phil Blunsom
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics

Prior work has shown that, on small amounts of training data, syntactic neural language models learn structurally sensitive generalisations more successfully than sequential language models. However, their computational complexity renders scaling difficult, and it remains an open question whether structural biases are still necessary when sequential models have access to ever larger amounts of training data. To answer this question, we introduce an efficient knowledge distillation (KD) technique that transfers knowledge from a syntactic language model trained on a small corpus to an LSTM language model, hence enabling the LSTM to develop a more structurally sensitive representation of the larger training data it learns from. On targeted syntactic evaluations, we find that, while sequential LSTMs perform much better than previously reported, our proposed technique substantially improves on this baseline, yielding a new state of the art. Our findings and analysis affirm the importance of structural biases, even in models that learn from large amounts of data.
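The sketch below illustrates the general form of a knowledge-distillation objective of the kind the abstract refers to: the student's next-word cross-entropy against the gold data is interpolated with a cross-entropy against the teacher's soft distribution. The interpolation weight and dummy logits are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def log_softmax(logits, axis=-1):
    logits = logits - logits.max(axis=axis, keepdims=True)
    return logits - np.log(np.exp(logits).sum(axis=axis, keepdims=True))

def distillation_loss(student_logits, teacher_logits, gold_ids, alpha=0.5):
    """Generic KD loss: interpolate the usual next-word cross-entropy with a
    cross-entropy against the teacher's soft distribution. The weighting here
    is illustrative, not the paper's."""
    log_p_student = log_softmax(student_logits)           # (T, V)
    p_teacher = np.exp(log_softmax(teacher_logits))       # (T, V)
    ce_gold = -log_p_student[np.arange(len(gold_ids)), gold_ids].mean()
    ce_soft = -(p_teacher * log_p_student).sum(axis=-1).mean()
    return alpha * ce_soft + (1.0 - alpha) * ce_gold

# Dummy example: 4 time steps, vocabulary of 10 words.
rng = np.random.default_rng(0)
student = rng.normal(size=(4, 10))
teacher = rng.normal(size=(4, 10))
gold = np.array([1, 3, 5, 7])
print(distillation_loss(student, teacher, gold))
```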

pdf
Neural Generative Rhetorical Structure Parsing
Amandla Mabona | Laura Rimell | Stephen Clark | Andreas Vlachos
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)

Rhetorical structure trees have been shown to be useful for several document-level tasks including summarization and document classification. Previous approaches to RST parsing have used discriminative models; however, these are less sample efficient than generative models, and RST parsing datasets are typically small. In this paper, we present the first generative model for RST parsing. Our model is a document-level RNN grammar (RNNG) with a bottom-up traversal order. We show that, for our parser’s traversal order, previous beam search algorithms for RNNGs have a left-branching bias which is ill-suited for RST parsing. We develop a novel beam search algorithm that keeps track of both structure- and word-generating actions without exhibiting this branching bias and results in absolute improvements of 6.8 and 2.9 on unlabelled and labelled F1 over previous algorithms. Overall, our generative model outperforms a discriminative model with the same features by 2.6 F1 points and achieves performance comparable to the state of the art, outperforming all published parsers from a recent replication study that do not use additional training data.

2018

pdf
Latent Tree Learning with Differentiable Parsers: Shift-Reduce Parsing and Chart Parsing
Jean Maillard | Stephen Clark
Proceedings of the Workshop on the Relevance of Linguistic Structure in Neural Architectures for NLP

Latent tree learning models represent sentences by composing their words according to an induced parse tree, all based on a downstream task. These models often outperform baselines which use (externally provided) syntax trees to drive the composition order. This work contributes (a) a new latent tree learning model based on shift-reduce parsing, with competitive downstream performance and non-trivial induced trees, and (b) an analysis of the trees learned by our shift-reduce model and by a chart-based model.

pdf
LSTMs Can Learn Syntax-Sensitive Dependencies Well, But Modeling Structure Makes Them Better
Adhiguna Kuncoro | Chris Dyer | John Hale | Dani Yogatama | Stephen Clark | Phil Blunsom
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

Language exhibits hierarchical structure, but recent work using a subject-verb agreement diagnostic argued that state-of-the-art language models, LSTMs, fail to learn long-range syntax-sensitive dependencies. Using the same diagnostic, we show that, in fact, LSTMs do succeed in learning such dependencies—provided they have enough capacity. We then explore whether models that have access to explicit syntactic information learn agreement more effectively, and how the way in which this structural information is incorporated into the model impacts performance. We find that the mere presence of syntactic information does not improve accuracy, but when model architecture is determined by syntax, number agreement is improved. Further, we find that the choice of how syntactic structure is built affects how well number agreement is learned: top-down construction outperforms left-corner and bottom-up variants in capturing non-local structural dependencies.
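A minimal sketch of the subject-verb agreement diagnostic mentioned above: a model passes an example if it scores the correctly inflected verb higher than the incorrect one, given the same prefix. The `sentence_logprob` scorer here is a hypothetical placeholder standing in for any language model.

```python
# Illustrative sketch of the agreement diagnostic: the model is credited with an
# example if it assigns higher probability to the correct verb inflection than
# to the incorrect one, given the preceding words.
def sentence_logprob(words):
    # Placeholder: a real diagnostic would query an LSTM or syntactic LM here.
    toy_scores = {"The keys to the cabinet are": -3.2,
                  "The keys to the cabinet is": -5.7}
    return toy_scores[" ".join(words)]

def agreement_correct(prefix, correct_verb, wrong_verb):
    return (sentence_logprob(prefix + [correct_verb])
            > sentence_logprob(prefix + [wrong_verb]))

prefix = "The keys to the cabinet".split()
print(agreement_correct(prefix, "are", "is"))   # True for this toy scorer
```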

2017

pdf bib
Visually Grounded and Textual Semantic Models Differentially Decode Brain Activity Associated with Concrete and Abstract Nouns
Andrew J. Anderson | Douwe Kiela | Stephen Clark | Massimo Poesio
Transactions of the Association for Computational Linguistics, Volume 5

Important advances have recently been made using computational semantic models to decode brain activity patterns associated with concepts; however, this work has almost exclusively focused on concrete nouns. How well these models extend to decoding abstract nouns is largely unknown. We address this question by applying state-of-the-art computational models to decode functional Magnetic Resonance Imaging (fMRI) activity patterns, elicited by participants reading and imagining a diverse set of both concrete and abstract nouns. One of the models we use is linguistic, exploiting the recent word2vec skipgram approach trained on Wikipedia. The second is visually grounded, using deep convolutional neural networks trained on Google Images. Dual coding theory considers concrete concepts to be encoded in the brain both linguistically and visually, and abstract concepts only linguistically. Splitting the fMRI data according to human concreteness ratings, we indeed observe that both models significantly decode the most concrete nouns; however, accuracy is significantly greater using the text-based models for the most abstract nouns. More generally this confirms that current computational models are sufficiently advanced to assist in investigating the representational structure of abstract concepts in the brain.

pdf
Introducing Structure into Neural Network-Based Semantic Models
Stephen Clark
Proceedings of the 15th Meeting on the Mathematics of Language

pdf
Speaking, Seeing, Understanding: Correlating semantic models with conceptual representation in the brain
Luana Bulat | Stephen Clark | Ekaterina Shutova
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing

Research in computational semantics is increasingly guided by our understanding of human semantic processing. However, semantic models are typically studied in the context of natural language processing system performance. In this paper, we present a systematic evaluation and comparison of a range of widely-used, state-of-the-art semantic models in their ability to predict patterns of conceptual representation in the human brain. Our results provide new insights both for the design of computational semantic models and for further research in cognitive neuroscience.

pdf
Latent Variable Dialogue Models and their Diversity
Kris Cao | Stephen Clark
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers

We present a dialogue generation model that directly captures the variability in possible responses to a given input, which reduces the ‘boring output’ issue of deterministic dialogue models. Experiments show that our model generates more diverse outputs than baseline models, and also generates more consistently acceptable output than sampling from a deterministic encoder-decoder model.

pdf
Modelling metaphor with attribute-based semantics
Luana Bulat | Stephen Clark | Ekaterina Shutova
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers

One of the key problems in computational metaphor modelling is finding the optimal level of abstraction of semantic representations, such that these are able to capture and generalise metaphorical mechanisms. In this paper we present the first metaphor identification method that uses representations constructed from property norms. Such norms have been previously shown to provide a cognitively plausible representation of concepts in terms of semantic properties. Our results demonstrate that such property-based semantic representations provide a suitable model of cross-domain knowledge projection in metaphors, outperforming standard distributional models on a metaphor identification task.

2016

pdf
Comparing Data Sources and Architectures for Deep Visual Representation Learning in Semantics
Douwe Kiela | Anita Lilla Verő | Stephen Clark
Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing

pdf
Multi-Modal Representations for Improved Bilingual Lexicon Learning
Ivan Vulić | Douwe Kiela | Stephen Clark | Marie-Francine Moens
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

pdf
Expected F-Measure Training for Shift-Reduce Parsing with Recurrent Neural Networks
Wenduan Xu | Michael Auli | Stephen Clark
Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

pdf
Vision and Feature Norms: Improving automatic feature norm learning through cross-modal maps
Luana Bulat | Douwe Kiela | Stephen Clark
Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

pdf
RELPRON: A Relative Clause Evaluation Data Set for Compositional Distributional Semantics
Laura Rimell | Jean Maillard | Tamara Polajnar | Stephen Clark
Computational Linguistics, Volume 42, Issue 4 - December 2016

2015

pdf
Discriminative Syntax-Based Word Ordering for Text Generation
Yue Zhang | Stephen Clark
Computational Linguistics, Volume 41, Issue 3 - September 2015

pdf
Visual Bilingual Lexicon Induction with Transferred ConvNet Features
Douwe Kiela | Ivan Vulić | Stephen Clark
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing

pdf
Specializing Word Embeddings for Similarity or Relatedness
Douwe Kiela | Felix Hill | Stephen Clark
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing

pdf
Multi- and Cross-Modal Semantics Beyond Vision: Grounding in Auditory Perception
Douwe Kiela | Stephen Clark
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing

pdf
Exploiting Image Generality for Lexical Entailment Detection
Douwe Kiela | Laura Rimell | Ivan Vulić | Stephen Clark
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)

pdf
Grounding Semantics in Olfactory Perception
Douwe Kiela | Luana Bulat | Stephen Clark
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)

pdf
CCG Supertagging with a Recurrent Neural Network
Wenduan Xu | Michael Auli | Stephen Clark
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)

pdf
Low-Rank Tensors for Verbs in Compositional Distributional Semantics
Daniel Fried | Tamara Polajnar | Stephen Clark
Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)

pdf
Learning Adjective Meanings with a Tensor-Based Skip-Gram Model
Jean Maillard | Stephen Clark
Proceedings of the Nineteenth Conference on Computational Natural Language Learning

pdf
From distributional semantics to feature norms: grounding semantic models in human perceptual data
Luana Fagarasan | Eva Maria Vecchi | Stephen Clark
Proceedings of the 11th International Conference on Computational Semantics

pdf bib
An Exploration of Discourse-Based Sentence Spaces for Compositional Distributional Semantics
Tamara Polajnar | Laura Rimell | Stephen Clark
Proceedings of the First Workshop on Linking Computational Models of Lexical, Sentential and Discourse-level Semantics

2014

pdf
Shift-Reduce CCG Parsing with a Dependency Model
Wenduan Xu | Stephen Clark | Yue Zhang
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)

pdf
Improving Multi-Modal Representations Using Image Dispersion: Why Less is Sometimes More
Douwe Kiela | Felix Hill | Anna Korhonen | Stephen Clark
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

pdf
A Type-Driven Tensor-Based Semantics for CCG
Jean Maillard | Stephen Clark | Edward Grefenstette
Proceedings of the EACL 2014 Workshop on Type Theory and Natural Language Semantics (TTNLS)

pdf
A Systematic Study of Semantic Vector Space Model Parameters
Douwe Kiela | Stephen Clark
Proceedings of the 2nd Workshop on Continuous Vector Space Models and their Compositionality (CVSC)

pdf bib
Application-Driven Relation Extraction with Limited Distant Supervision
Andreas Vlachos | Stephen Clark
Proceedings of the First AHA!-Workshop on Information Discovery in Text

pdf
Practical Linguistic Steganography using Contextual Synonym Substitution and a Novel Vertex Coding Method
Ching-Yun Chang | Stephen Clark
Computational Linguistics, Volume 40, Issue 2 - June 2014

pdf
Improving Distributional Semantic Vectors through Context Selection and Normalisation
Tamara Polajnar | Stephen Clark
Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics

pdf
Reducing Dimensions of Tensors in Type-Driven Distributional Semantics
Tamara Polajnar | Luana Fǎgǎrǎşan | Stephen Clark
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)

pdf
Evaluation of Simple Distributional Compositional Operations on Longer Texts
Tamara Polajnar | Laura Rimell | Stephen Clark
Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC'14)

Distributional semantic models have been effective at representing linguistic semantics at the word level, and more recently research has moved on to the construction of distributional representations for larger segments of text. However, it is not well understood how the composition operators that work well on short phrase-based models scale up to full-length sentences. In this paper we test several simple compositional methods on a sentence-length similarity task and discover that their performance peaks at fewer than ten operations. We also introduce a novel sentence segmentation method that reduces the number of compositional operations.
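As a concrete illustration of the simple compositional operations evaluated in this paper, the sketch below composes word vectors by addition or elementwise multiplication and compares two sentences by cosine similarity; the vectors are random placeholders rather than trained distributional vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "dog", "chased", "cat", "a", "kitten", "pursued"]
# Placeholder vectors; a real distributional model would supply these.
vectors = {w: rng.normal(size=50) for w in vocab}

def compose(words, op=np.add):
    """Fold an elementwise operation (addition or multiplication) over word vectors."""
    result = vectors[words[0]].copy()
    for w in words[1:]:
        result = op(result, vectors[w])
    return result

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

s1 = "the dog chased the cat".split()
s2 = "a dog pursued a kitten".split()
print(cosine(compose(s1), compose(s2)))                            # additive composition
print(cosine(compose(s1, np.multiply), compose(s2, np.multiply)))  # multiplicative composition
```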

pdf
A New Corpus and Imitation Learning Framework for Context-Dependent Semantic Parsing
Andreas Vlachos | Stephen Clark
Transactions of the Association for Computational Linguistics, Volume 2

Semantic parsing is the task of translating natural language utterances into a machine-interpretable meaning representation. Most approaches to this task have been evaluated on a small number of existing corpora which assume that all utterances must be interpreted according to a database and typically ignore context. In this paper we present a new, publicly available corpus for context-dependent semantic parsing. The meaning representation language (MRL) used for the annotation was designed to support a portable, interactive tourist information system. We develop a semantic parser for this corpus by adapting the imitation learning algorithm DAgger without requiring alignment information during training. DAgger improves upon independently trained classifiers by 9.0 and 4.8 points in F-score on the development and test sets respectively.
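For readers unfamiliar with DAgger, the self-contained sketch below shows the general shape of the algorithm (roll out the current policy, label every visited state with the expert's action, aggregate, retrain); the toy task and classifier are illustrative assumptions, not the paper's semantic-parsing setup.

```python
from collections import Counter, defaultdict

# Toy data: tag each token as 'X' if the previous *predicted* tag was 'O', else 'O'.
sentences = [["a", "b", "a", "c"], ["b", "b", "a"], ["c", "a", "b", "b"]]

def expert_action(state):
    token, prev_tag = state
    return "X" if prev_tag == "O" else "O"       # oracle the learner must imitate

class Policy:
    """Hypothetical stand-in classifier: majority vote per observed state."""
    def __init__(self):
        self.table = {}
    def fit(self, dataset):
        counts = defaultdict(Counter)
        for state, action in dataset:
            counts[state][action] += 1
        self.table = {s: c.most_common(1)[0][0] for s, c in counts.items()}
    def predict(self, state):
        return self.table.get(state, "O")

def rollout(policy, sentence):
    """Run the current policy left to right, yielding every state it visits."""
    prev = "O"
    for token in sentence:
        state = (token, prev)
        yield state
        prev = policy.predict(state)              # follow the learner's own action

def dagger(n_iterations=3):
    dataset, policy = [], Policy()
    for _ in range(n_iterations):
        for sent in sentences:
            for state in rollout(policy, sent):
                dataset.append((state, expert_action(state)))   # expert labels visited states
        policy.fit(dataset)                       # retrain on the aggregated dataset
    return policy

policy = dagger()
print([policy.predict(s) for s in rollout(policy, ["a", "b", "c"])])
```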

2013

pdf
The Frobenius Anatomy of Relative Pronouns
Stephen Clark | Bob Coecke | Mehrnoosh Sadrzadeh
Proceedings of the 13th Meeting on the Mathematics of Language (MoL 13)

pdf
Detecting Compositionality of Multi-Word Expressions using Nearest Neighbours in Vector Space Models
Douwe Kiela | Stephen Clark
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing

pdf
Semantic Parsing as Machine Translation
Jacob Andreas | Andreas Vlachos | Stephen Clark
Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)

2012

pdf
Adjective Deletion for Linguistic Steganography and Secret Sharing
Ching-Yun Chang | Stephen Clark
Proceedings of COLING 2012

pdf
The Secret’s in the Word Order: Text-to-Text Generation for Linguistic Steganography
Ching-Yun Chang | Stephen Clark
Proceedings of COLING 2012

pdf
Syntax-Based Word Ordering Incorporating a Large-Scale Language Model
Yue Zhang | Graeme Blackwood | Stephen Clark
Proceedings of the 13th Conference of the European Chapter of the Association for Computational Linguistics

2011

pdf
Syntax-Based Grammaticality Improvement using CCG and Guided Search
Yue Zhang | Stephen Clark
Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing

pdf
Concrete Sentence Spaces for Compositional Distributional Models of Meaning
Edward Grefenstette | Mehrnoosh Sadrzadeh | Stephen Clark | Bob Coecke | Stephen Pulman
Proceedings of the Ninth International Conference on Computational Semantics (IWCS 2011)

pdf
Syntactic Processing Using the Generalized Perceptron and Beam Search
Yue Zhang | Stephen Clark
Computational Linguistics, Volume 37, Issue 1 - March 2011

pdf
Shift-Reduce CCG Parsing
Yue Zhang | Stephen Clark
Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies

2010

pdf bib
Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics
Jan Hajič | Sandra Carberry | Stephen Clark | Joakim Nivre
Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics

pdf
Faster Parsing by Supertagger Adaptation
Jonathan K. Kummerfeld | Jessika Roesner | Tim Dawborn | James Haggerty | James R. Curran | Stephen Clark
Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics

pdf bib
Proceedings of the ACL 2010 Conference Short Papers
Jan Hajič | Sandra Carberry | Stephen Clark | Joakim Nivre
Proceedings of the ACL 2010 Conference Short Papers

pdf
A Fast Decoder for Joint Word Segmentation and POS-Tagging Using a Single Discriminative Model
Yue Zhang | Stephen Clark
Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing

pdf
Practical Linguistic Steganography Using Contextual Synonym Substitution and Vertex Colour Coding
Ching-Yun Chang | Stephen Clark
Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing

pdf
Chart Pruning for Fast Lexicalised-Grammar Parsing
Yue Zhang | Byung-Gyu Ahn | Stephen Clark | Curt Van Wyk | James R. Curran | Laura Rimell
Coling 2010: Posters

pdf
Cambridge: Parser Evaluation Using Textual Entailment by Grammatical Relation Comparison
Laura Rimell | Stephen Clark
Proceedings of the 5th International Workshop on Semantic Evaluation

pdf
Linguistic Steganography Using Automatically Generated Paraphrases
Ching-Yun Chang | Stephen Clark
Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics

2009

pdf
Unbounded Dependency Recovery for Parser Evaluation
Laura Rimell | Stephen Clark | Mark Steedman
Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing

pdf
Transition-Based Parsing of the Chinese Treebank using a Global Discriminative Model
Yue Zhang | Stephen Clark
Proceedings of the 11th International Conference on Parsing Technologies (IWPT’09)

pdf
Comparing the Accuracy of CCG and Penn Treebank Parsers
Stephen Clark | James R. Curran
Proceedings of the ACL-IJCNLP 2009 Conference Short Papers

2008

pdf
Joint Word Segmentation and POS Tagging Using a Single Perceptron
Yue Zhang | Stephen Clark
Proceedings of ACL-08: HLT

pdf bib
Coling 2008: Proceedings of the workshop on Cross-Framework and Cross-Domain Parser Evaluation
Johan Bos | Edward Briscoe | Aoife Cahill | John Carroll | Stephen Clark | Ann Copestake | Dan Flickinger | Josef van Genabith | Julia Hockenmaier | Aravind Joshi | Ronald Kaplan | Tracy Holloway King | Sandra Kuebler | Dekang Lin | Jan Tore Lønning | Christopher Manning | Yusuke Miyao | Joakim Nivre | Stephan Oepen | Kenji Sagae | Nianwen Xue | Yi Zhang
Coling 2008: Proceedings of the workshop on Cross-Framework and Cross-Domain Parser Evaluation

pdf
Constructing a Parser Evaluation Scheme
Laura Rimell | Stephen Clark
Coling 2008: Proceedings of the workshop on Cross-Framework and Cross-Domain Parser Evaluation

pdf bib
Coling 2008: Proceedings of the workshop on Grammar Engineering Across Frameworks
Stephen Clark | Tracy Holloway King
Coling 2008: Proceedings of the workshop on Grammar Engineering Across Frameworks

pdf
Adapting a Lexicalized-Grammar Parser to Contrasting Domains
Laura Rimell | Stephen Clark
Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing

pdf
A Tale of Two Parsers: Investigating and Combining Graph-based and Transition-based Dependency Parsing
Yue Zhang | Stephen Clark
Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing

2007

pdf bib
Perceptron Training for a Wide-Coverage Lexicalized-Grammar Parser
Stephen Clark | James Curran
ACL 2007 Workshop on Deep Linguistic Processing

pdf
Improving the Efficiency of a Wide-Coverage CCG Parser
Bojan Djordjevic | James Curran | Stephen Clark
Proceedings of the Tenth International Conference on Parsing Technologies

pdf
Chinese Segmentation with a Word-Based Perceptron Algorithm
Yue Zhang | Stephen Clark
Proceedings of the 45th Annual Meeting of the Association of Computational Linguistics

pdf
Formalism-Independent Parser Evaluation with CCG and DepBank
Stephen Clark | James Curran
Proceedings of the 45th Annual Meeting of the Association of Computational Linguistics

pdf
Linguistically Motivated Large-Scale NLP with C&C and Boxer
James Curran | Stephen Clark | Johan Bos
Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics Companion Volume Proceedings of the Demo and Poster Sessions

pdf
Wide-Coverage Efficient Statistical Parsing with CCG and Log-Linear Models
Stephen Clark | James R. Curran
Computational Linguistics, Volume 33, Number 4, December 2007

2006

pdf
Multi-Tagging for Lexicalized-Grammar Parsing
James R. Curran | Stephen Clark | David Vadas
Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics

pdf
Partial Training for a Lexicalized-Grammar Parser
Stephen Clark | James Curran
Proceedings of the Human Language Technology Conference of the NAACL, Main Conference

2004

pdf
The Importance of Supertagging for Wide-Coverage CCG Parsing
Stephen Clark | James R. Curran
COLING 2004: Proceedings of the 20th International Conference on Computational Linguistics

pdf
Wide-Coverage Semantic Representations from a CCG Parser
Johan Bos | Stephen Clark | Mark Steedman | James R. Curran | Julia Hockenmaier
COLING 2004: Proceedings of the 20th International Conference on Computational Linguistics

pdf
Object-Extraction and Question-Parsing using CCG
Stephen Clark | Mark Steedman | James R. Curran
Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing

pdf
Parsing the WSJ Using CCG and Log-Linear Models
Stephen Clark | James R. Curran
Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics (ACL-04)

2003

pdf
Bootstrapping POS-taggers using unlabelled data
Stephen Clark | James Curran | Miles Osborne
Proceedings of the Seventh Conference on Natural Language Learning at HLT-NAACL 2003

pdf
Language Independent NER using a Maximum Entropy Tagger
James Curran | Stephen Clark
Proceedings of the Seventh Conference on Natural Language Learning at HLT-NAACL 2003

pdf
Log-Linear Models for Wide-Coverage CCG Parsing
Stephen Clark | James Curran
Proceedings of the 2003 Conference on Empirical Methods in Natural Language Processing

pdf
Bootstrapping statistical parsers from small datasets
Mark Steedman | Miles Osborne | Anoop Sarkar | Stephen Clark | Rebecca Hwa | Julia Hockenmaier | Paul Ruhlen | Steven Baker | Jeremiah Crim
10th Conference of the European Chapter of the Association for Computational Linguistics

pdf
Investigating GIS and Smoothing for Maximum Entropy Taggers
James R. Curran | Stephen Clark
10th Conference of the European Chapter of the Association for Computational Linguistics

pdf
Example Selection for Bootstrapping Statistical Parsers
Mark Steedman | Rebecca Hwa | Stephen Clark | Miles Osborne | Anoop Sarkar | Julia Hockenmaier | Paul Ruhlen | Steven Baker | Jeremiah Crim
Proceedings of the 2003 Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics

2002

pdf
Building Deep Dependency Structures using a Wide-Coverage CCG Parser
Stephen Clark | Julia Hockenmaier | Mark Steedman
Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics

pdf
Class-Based Probability Estimation Using a Semantic Hierarchy
Stephen Clark | David Weir
Computational Linguistics, Volume 28, Number 2, June 2002

pdf
Supertagging for Combinatory Categorial Grammar
Stephen Clark
Proceedings of the Sixth International Workshop on Tree Adjoining Grammar and Related Frameworks (TAG+6)

2001

pdf
Class-Based Probability Estimation Using a Semantic Hierarchy
Stephen Clark | David Weir
Second Meeting of the North American Chapter of the Association for Computational Linguistics

2000

pdf
A Class-based Probabilistic approach to Structural Disambiguation
Stephen Clark | David Weir
COLING 2000 Volume 1: The 18th International Conference on Computational Linguistics

1999

pdf
An Iterative Approach to Estimating Frequencies over a Semantic Hierarchy
Stephen Clark | David Weir
1999 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora