Joint learning of constraint weights and gradient inputs in Gradient Symbolic Computation with constrained optimization

Max Nelson


Abstract
This paper proposes a method for the joint optimization of constraint weights and symbol activations within the Gradient Symbolic Computation (GSC) framework. The set of grammars representable in GSC is proven to be a subset of those representable with lexically-scaled faithfulness constraints. This fact is then used to recast the problem of learning constraint weights and symbol activations in GSC as a quadratically-constrained version of learning lexically-scaled faithfulness grammars. This results in an optimization problem that can be solved using Sequential Quadratic Programming.
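The abstract's key move, posing joint learning of constraint weights and symbol activations as a quadratically-constrained program solvable by Sequential Quadratic Programming, can be illustrated with a toy sketch. Everything below (the loss, the violation matrices, the particular quadratic constraint, all dimensions) is invented for illustration and is not the paper's formulation; it only shows an SQP method (SciPy's SLSQP solver) handling a problem of this general shape: a bilinear objective in weights and activations under a quadratic equality constraint.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical dimensions: 3 constraints, 4 gradient symbol activations.
n_con, n_act = 3, 4

# Invented violation matrices for two candidate forms: violations are
# linear in the activations a, so weighted harmony is bilinear in (w, a).
V = [rng.standard_normal((n_con, n_act)) for _ in range(2)]
h_target = np.array([-1.0, -3.0])  # made-up target harmonies

def loss(x):
    # x packs [weights | activations].
    w, a = x[:n_con], x[n_con:]
    # Harmony of each candidate = negative weighted violations.
    h_pred = np.array([-(w @ (Vi @ a)) for Vi in V])
    return np.sum((h_pred - h_target) ** 2)

# Quadratic equality constraint on the activations (a stand-in for a
# well-formedness condition on gradient blends): ||a||^2 = 1.
cons = [{"type": "eq", "fun": lambda x: x[n_con:] @ x[n_con:] - 1.0}]

# Constraint weights must be nonnegative; activations lie in [0, 1].
bounds = [(0, None)] * n_con + [(0, 1)] * n_act

x0 = np.full(n_con + n_act, 0.5)
res = minimize(loss, x0, method="SLSQP", bounds=bounds, constraints=cons)
print("weights:", res.x[:n_con])
print("activations:", res.x[n_con:])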
Anthology ID:
2020.sigmorphon-1.27
Volume:
Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology
Month:
July
Year:
2020
Address:
Online
Editors:
Garrett Nicolai, Kyle Gorman, Ryan Cotterell
Venue:
SIGMORPHON
SIG:
SIGMORPHON
Publisher:
Association for Computational Linguistics
Pages:
224–232
URL:
https://aclanthology.org/2020.sigmorphon-1.27
DOI:
10.18653/v1/2020.sigmorphon-1.27
Cite (ACL):
Max Nelson. 2020. Joint learning of constraint weights and gradient inputs in Gradient Symbolic Computation with constrained optimization. In Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology, pages 224–232, Online. Association for Computational Linguistics.
Cite (Informal):
Joint learning of constraint weights and gradient inputs in Gradient Symbolic Computation with constrained optimization (Nelson, SIGMORPHON 2020)
PDF:
https://aclanthology.org/2020.sigmorphon-1.27.pdf
Video:
http://slideslive.com/38929880