Max Nelson


2020

Joint learning of constraint weights and gradient inputs in Gradient Symbolic Computation with constrained optimization
Max Nelson
Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology

This paper proposes a method for the joint optimization of constraint weights and symbol activations within the Gradient Symbolic Computation (GSC) framework. The set of grammars representable in GSC is proven to be a subset of those representable with lexically-scaled faithfulness constraints. This fact is then used to recast the problem of learning constraint weights and symbol activations in GSC as a quadratically-constrained version of learning lexically-scaled faithfulness grammars. This results in an optimization problem that can be solved using Sequential Quadratic Programming.
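The abstract frames learning as a quadratically-constrained optimization problem solvable with Sequential Quadratic Programming. As an illustration only, the sketch below shows how a small problem of that general shape (joint weights and activations under a quadratic constraint) can be handed to an off-the-shelf SQP-style solver, here SciPy's SLSQP; the toy loss, candidate violation vectors, and unit-disk constraint are hypothetical stand-ins, not the paper's actual objective or constraints.

```python
# Minimal, illustrative sketch (not the paper's implementation) of solving a
# quadratically-constrained weight/activation learning problem with SciPy's
# SLSQP solver. All quantities below are hypothetical stand-ins chosen only
# to show the optimization setup.
import numpy as np
from scipy.optimize import minimize

# Toy setup: 2 constraint weights (w) and 2 gradient symbol activations (a),
# packed into a single parameter vector x = [w1, w2, a1, a2].
def loss(x):
    w, a = x[:2], x[2:]
    # Hypothetical harmony-style objective: prefer parameters that separate a
    # "winner" candidate from a "loser" candidate whose constraint violations
    # scale with the gradient activations.
    winner_violations = np.array([a[0], 1.0])
    loser_violations = np.array([1.0, a[1]])
    margin = w @ (loser_violations - winner_violations)
    return np.logaddexp(0.0, -margin)  # log-loss on the winner/loser margin

constraints = [
    # Illustrative quadratic constraint: activations lie on or inside the unit disk.
    {"type": "ineq", "fun": lambda x: 1.0 - x[2] ** 2 - x[3] ** 2},
]
# Nonnegative weights, activations bounded in [0, 1].
bounds = [(0, None), (0, None), (0, 1), (0, 1)]

x0 = np.array([1.0, 1.0, 0.5, 0.5])
result = minimize(loss, x0, method="SLSQP", bounds=bounds, constraints=constraints)
print("weights:", result.x[:2], "activations:", result.x[2:])
```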

Probing RNN Encoder-Decoder Generalization of Subregular Functions using Reduplication
Max Nelson | Hossep Dolatian | Jonathan Rawski | Brandon Prickett
Proceedings of the Society for Computation in Linguistics 2020

Phonotactic learning with neural language models
Connor Mayer | Max Nelson
Proceedings of the Society for Computation in Linguistics 2020

2019

Proceedings of the Society for Computation in Linguistics (SCiL) 2019
Gaja Jarosz | Max Nelson | Brendan O’Connor | Joe Pater
Proceedings of the Society for Computation in Linguistics (SCiL) 2019

Segmentation and UR Acquisition with UR Constraints
Max Nelson
Proceedings of the Society for Computation in Linguistics (SCiL) 2019