Terry Regier


2020

Semantic categories of artifacts and animals reflect efficient coding
Noga Zaslavsky | Terry Regier | Naftali Tishby | Charles Kemp
Proceedings of the Society for Computation in Linguistics 2020

2018

Probing sentence embeddings for structure-dependent tense
Geoff Bacon | Terry Regier
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP

Learning universal sentence representations that accurately model sentential semantic content is a current goal of natural language processing research. A prominent and successful approach is to train recurrent neural networks (RNNs) to encode sentences into fixed-length vectors. Many core linguistic phenomena that one would like to model in universal sentence representations depend on syntactic structure. Although RNNs have no explicit syntactic structural representations, they have seen widespread success in practical tasks, and there is some evidence that they can approximate such structure-dependent phenomena under certain conditions. In this work, we assess RNNs' ability to learn the structure-dependent phenomenon of main clause tense.

1991

Learning Perceptually-Grounded Semantics in the L₀ Project
Terry Regier
Proceedings of the 29th Annual Meeting of the Association for Computational Linguistics