Dana Angluin


2018

Context-Free Transductions with Neural Stacks
Yiding Hao | William Merrill | Dana Angluin | Robert Frank | Noah Amsel | Andrew Benz | Simon Mendelsohn
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP

This paper analyzes the behavior of stack-augmented recurrent neural network (RNN) models. Motivated by the architectural similarity between stack RNNs and pushdown transducers, we train stack RNN models on a number of tasks, including string reversal, context-free language modelling, and cumulative XOR evaluation. Examining the behavior of our networks, we show that stack-augmented RNNs can discover intuitive stack-based strategies for solving our tasks. However, stack RNNs are more difficult to train than classical architectures such as LSTMs. Rather than employing stack-based strategies, more complex networks often find approximate solutions by using the stack as unstructured memory.
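
For readers unfamiliar with the architecture: stack RNNs of this kind augment an RNN controller with a differentiable ("soft") stack, typically in the style of Grefenstette et al. (2015). The sketch below is a minimal numpy illustration of such a continuous stack, not the paper's own code; the names stack_step, d (push strength), and u (pop strength) are illustrative. In a full model the RNN controller would emit v, d, and u at every time step, which is what makes the whole structure trainable end to end.

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def stack_step(values, strengths, v, d, u):
        # values:    current stack contents, bottom first, one vector each
        # strengths: one scalar in [0, 1] per stored vector
        # v: vector to push, d: push strength, u: pop strength
        # Pop first: consume up to u units of strength from the top down,
        # using the pre-update strengths.
        popped = []
        for i, s in enumerate(strengths):
            above = sum(strengths[i + 1:])        # strength sitting above item i
            popped.append(relu(s - relu(u - above)))
        # Then push the new value with strength d.
        values = values + [v]
        strengths = popped + [d]
        # Read: strength-weighted sum of (at most) the top one unit of stack.
        r = np.zeros_like(v)
        for i, (vi, si) in enumerate(zip(values, strengths)):
            above = sum(strengths[i + 1:])
            r = r + min(si, relu(1.0 - above)) * vi
        return values, strengths, r

    # Push "a" then "b", then pop: reads return b, then a (string reversal).
    a, b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    V, S = [], []
    V, S, _  = stack_step(V, S, a, d=1.0, u=0.0)            # push a
    V, S, r1 = stack_step(V, S, b, d=1.0, u=0.0)            # push b; r1 = b
    V, S, r2 = stack_step(V, S, np.zeros(2), d=0.0, u=1.0)  # pop;    r2 = a
    print(r1, r2)                                           # [0. 1.] [1. 0.]

Driving the stack with hard (0 or 1) push and pop strengths recovers the discrete last-in, first-out behavior a task like string reversal calls for; a trained network has to learn to produce such strengths, or else, as the abstract notes, fall back on using the stack as unstructured memory.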

2011

Effects of Meaning-Preserving Corrections on Language Learning
Dana Angluin | Leonor Becerra-Bonache
Proceedings of the Fifteenth Conference on Computational Natural Language Learning

2009

Experiments Using OSTIA for a Language Production Task
Dana Angluin | Leonor Becerra-Bonache
Proceedings of the EACL 2009 Workshop on Computational Linguistic Aspects of Grammatical Inference