Context-Aware Prediction of Derivational Word-forms

Ekaterina Vylomova, Ryan Cotterell, Timothy Baldwin, Trevor Cohn


Abstract
Derivational morphology is a fundamental and complex characteristic of language. In this paper we propose a new task of predicting the derivational form of a given base-form lemma that is appropriate for a given context. We present an encoder-decoder style neural network that produces the derived form character-by-character, conditioned on a character-level representation of the base form and a representation of the context. We demonstrate that our model is able to generate valid context-sensitive derivations from known base forms, but is less accurate in a lexicon-agnostic setting.
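The abstract describes a character-level encoder-decoder conditioned on context. Below is a minimal sketch of that general idea, not the authors' implementation (see the linked ivri/dmorph repository for that): it assumes PyTorch, single-layer GRUs, and a precomputed context vector (e.g. an averaged embedding of the surrounding sentence), all of which are illustrative choices.

```python
import torch
import torch.nn as nn

class DerivationSeq2Seq(nn.Module):
    """Toy character-level encoder-decoder conditioned on a context vector."""

    def __init__(self, n_chars, char_dim=32, hidden_dim=64, ctx_dim=64):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.encoder = nn.GRU(char_dim, hidden_dim, batch_first=True)
        # The context vector is projected and added to the encoder's final
        # state before decoding (one of several plausible ways to inject it).
        self.ctx_proj = nn.Linear(ctx_dim, hidden_dim)
        self.decoder = nn.GRU(char_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_chars)

    def forward(self, base_chars, context_vec, target_chars):
        # base_chars:   (batch, src_len) character ids of the base form
        # context_vec:  (batch, ctx_dim) representation of the sentential context
        # target_chars: (batch, tgt_len) gold derived form, used for teacher forcing
        _, h = self.encoder(self.char_emb(base_chars))    # h: (1, batch, hidden)
        h = h + self.ctx_proj(context_vec).unsqueeze(0)   # inject context into the state
        dec_out, _ = self.decoder(self.char_emb(target_chars), h)
        return self.out(dec_out)                          # (batch, tgt_len, n_chars) logits
```

Training such a model with cross-entropy over the predicted characters, and decoding greedily or with beam search at test time, would follow standard sequence-to-sequence practice.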
Anthology ID:
E17-2019
Volume:
Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
Month:
April
Year:
2017
Address:
Valencia, Spain
Editors:
Mirella Lapata, Phil Blunsom, Alexander Koller
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
118–124
URL:
https://aclanthology.org/E17-2019
Cite (ACL):
Ekaterina Vylomova, Ryan Cotterell, Timothy Baldwin, and Trevor Cohn. 2017. Context-Aware Prediction of Derivational Word-forms. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers, pages 118–124, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal):
Context-Aware Prediction of Derivational Word-forms (Vylomova et al., EACL 2017)
PDF:
https://preview.aclanthology.org/fix-dup-bibkey/E17-2019.pdf
Code:
ivri/dmorph