Anna N. Rafferty

Also published as: Anna Rafferty


2020

Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages
Tyler A. Chang | Anna Rafferty
Proceedings of the 5th Workshop on Representation Learning for NLP

We train neural machine translation (NMT) models from English to six target languages, using NMT encoder representations to predict ancestor constituent labels of source-language words. We find that NMT encoders learn similar source syntax regardless of NMT target language, relying on explicit morphosyntactic cues to extract syntactic features from source sentences. Furthermore, the NMT encoders outperform RNNs trained directly on several of the constituent label prediction tasks, suggesting that NMT encoder representations can be used effectively for natural language tasks involving syntax. However, both the NMT encoders and the directly trained RNNs learn syntactic information that differs substantially from that of a probabilistic context-free grammar (PCFG) parser. Despite lower overall accuracy scores, the PCFG often performs well on sentences for which the RNN-based models perform poorly, suggesting that RNN architectures are constrained in the types of syntax they can learn.
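As context for the probing setup described in the abstract, the following is a minimal sketch of the general technique: a linear probe trained on frozen encoder states to predict constituent labels. This is not the authors' code; the hidden size, label count, and the `encoder_states`/`labels` tensors are illustrative assumptions.

```python
# Minimal probing sketch (illustrative, not the paper's implementation).
# Assumptions: `encoder_states` is a (num_words, HIDDEN_DIM) tensor of frozen
# NMT encoder representations; `labels` holds ancestor constituent label ids.
import torch
import torch.nn as nn

HIDDEN_DIM = 512    # assumed encoder hidden size
NUM_LABELS = 30     # assumed constituent label inventory (NP, VP, S, ...)

probe = nn.Linear(HIDDEN_DIM, NUM_LABELS)  # the only trainable component
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def probe_step(encoder_states: torch.Tensor, labels: torch.Tensor) -> float:
    """One update of the probe; the encoder itself stays frozen."""
    optimizer.zero_grad()
    logits = probe(encoder_states.detach())  # detach: no gradient to the encoder
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The key design point of probes of this kind is that only the small classifier is trained, so its accuracy reflects what syntactic information is already present in the encoder representations rather than what the probe could learn on its own.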

2011

Exploring the Relationship Between Learnability and Linguistic Universals
Anna N. Rafferty | Thomas L. Griffiths | Marc Ettlinger
Proceedings of the 2nd Workshop on Cognitive Modeling and Computational Linguistics

2009

Random Walks for Text Semantic Similarity
Daniel Ramage | Anna N. Rafferty | Christopher D. Manning
Proceedings of the 2009 Workshop on Graph-based Methods for Natural Language Processing (TextGraphs-4)

2008

Parsing Three German Treebanks: Lexicalized and Unlexicalized Baselines
Anna Rafferty | Christopher D. Manning
Proceedings of the Workshop on Parsing German

Finding Contradictions in Text
Marie-Catherine de Marneffe | Anna N. Rafferty | Christopher D. Manning
Proceedings of ACL-08: HLT