Brandon Prickett


2022

Proceedings of the Society for Computation in Linguistics 2022
Allyson Ettinger | Tim Hunter | Brandon Prickett
Proceedings of the Society for Computation in Linguistics 2022

Learning Stress Patterns with a Sequence-to-Sequence Neural Network
Brandon Prickett | Joe Pater
Proceedings of the Society for Computation in Linguistics 2022

2021

Proceedings of the Society for Computation in Linguistics 2021
Allyson Ettinger | Ellie Pavlick | Brandon Prickett
Proceedings of the Society for Computation in Linguistics 2021

2020

Probing RNN Encoder-Decoder Generalization of Subregular Functions using Reduplication
Max Nelson | Hossep Dolatian | Jonathan Rawski | Brandon Prickett
Proceedings of the Society for Computation in Linguistics 2020

2019

Learning Exceptionality and Variation with Lexically Scaled MaxEnt
Coral Hughto | Andrew Lamont | Brandon Prickett | Gaja Jarosz
Proceedings of the Society for Computation in Linguistics 2019

2018

Seq2Seq Models with Dropout can Learn Generalizable Reduplication
Brandon Prickett | Aaron Traylor | Joe Pater
Proceedings of the Fifteenth Workshop on Computational Research in Phonetics, Phonology, and Morphology

Natural language reduplication can pose a challenge to neural models of language and has been argued to require variables (Marcus et al., 1999). Sequence-to-sequence neural networks have been shown to perform well on a number of other morphological tasks (Cotterell et al., 2016) and to produce results that correlate closely with human behavior (Kirov, 2017; Kirov & Cotterell, 2018), yet they include no explicit variables in their architecture. We find that such networks can learn a reduplicative pattern that generalizes to novel segments if they are trained with dropout (Srivastava et al., 2014). We argue that this matches the scope of generalization observed in human reduplication.
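
The recipe the abstract describes can be made concrete with a small sketch. The following is not the authors' implementation: the segment inventory, the total-reduplication toy task, the model size, and the training regime are all illustrative assumptions, and the paper's actual experiments differ. It shows the core idea: an LSTM encoder-decoder (a common seq2seq variant) trained with dropout (Srivastava et al., 2014) on reduplicated forms, then tested on stems containing a consonant and a vowel withheld from training.

```python
# Minimal sketch (not the paper's code): LSTM seq2seq with dropout,
# trained on total reduplication (stem -> stem+stem) over toy CV stems.
# Segment inventory, sizes, and hyperparameters are illustrative assumptions.
import random
import torch
import torch.nn as nn

SOS, EOS = 0, 1
SEGMENTS = list("bdgptkaiu")                 # toy inventory (assumption)
stoi = {s: i + 2 for i, s in enumerate(SEGMENTS)}
itos = {i: s for s, i in stoi.items()}
V = len(SEGMENTS) + 2

def encode(word):
    return [stoi[c] for c in word]

# Total reduplication: "ba" -> "baba".
def make_pairs(consonants, vowels):
    return [(c + v, (c + v) * 2) for c in consonants for v in vowels]

train_pairs = make_pairs("bdgpt", "ai")      # "k" and "u" never seen in training
test_pairs  = make_pairs("k", "u")           # novel-segment test items

class Seq2Seq(nn.Module):
    def __init__(self, vocab, hidden=64, p_drop=0.5):
        super().__init__()
        self.emb = nn.Embedding(vocab, hidden)
        self.drop = nn.Dropout(p_drop)       # the key ingredient per the paper
        self.enc = nn.LSTM(hidden, hidden, batch_first=True)
        self.dec = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, src, tgt_in):
        # Encode the stem, then decode with teacher forcing.
        _, state = self.enc(self.drop(self.emb(src)))
        dec_out, _ = self.dec(self.drop(self.emb(tgt_in)), state)
        return self.out(dec_out)

    @torch.no_grad()
    def generate(self, src, max_len=10):
        # Greedy decoding, one segment at a time.
        _, state = self.enc(self.emb(src))
        tok, out = torch.tensor([[SOS]]), []
        for _ in range(max_len):
            dec_out, state = self.dec(self.emb(tok), state)
            tok = self.out(dec_out).argmax(-1)
            if tok.item() == EOS:
                break
            out.append(itos.get(tok.item(), "?"))
        return "".join(out)

model = Seq2Seq(V)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(300):
    random.shuffle(train_pairs)
    for stem, redup in train_pairs:
        src = torch.tensor([encode(stem)])
        tgt = encode(redup) + [EOS]
        tgt_in = torch.tensor([[SOS] + tgt[:-1]])
        loss = loss_fn(model(src, tgt_in).squeeze(0), torch.tensor(tgt))
        opt.zero_grad()
        loss.backward()
        opt.step()

model.eval()                                 # disables dropout at test time
for stem, gold in test_pairs:
    pred = model.generate(torch.tensor([encode(stem)]))
    print(f"{stem} -> {pred} (target {gold})")
```

One intuition for the design, consistent with the abstract's claim: by randomly silencing units during training, dropout discourages the network from tying the copying pattern to the particular embedding dimensions of the training segments, so the pattern has a chance to extend to held-out segments. Whether this toy setup generalizes as the paper's experiments do depends on the assumed hyperparameters.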