Ontology-Aware Token Embeddings for Prepositional Phrase Attachment

Pradeep Dasigi, Waleed Ammar, Chris Dyer, Eduard Hovy


Abstract
Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language. Instead, we embed semantic concepts (or synsets) as defined in WordNet and represent a word token in a particular context by estimating a distribution over relevant semantic concepts. We use the new, context-sensitive embeddings in a model for predicting prepositional phrase (PP) attachments and jointly learn the concept embeddings and model parameters. We show that using context-sensitive embeddings improves the accuracy of the PP attachment model by 5.4% absolute points, which amounts to a 34.4% relative reduction in errors.
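The idea in the abstract can be illustrated with a minimal sketch (not the authors' implementation; see the linked pdasigi/onto-lstm repository for that): a token's embedding is computed as a context-weighted mixture of embeddings of its WordNet synsets rather than a single type-level vector. Here the synset embedding table is randomly initialized as a stand-in (in the paper it is learned jointly with the PP attachment model), the context vectors are placeholders for an encoder state, and the paper's use of hypernyms is omitted for brevity.

```python
# Illustrative sketch of an ontology-aware token embedding.
# A token is embedded as a softmax-weighted mixture of its WordNet synset
# embeddings, with weights conditioned on a context vector.
import numpy as np
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)  # make sure the WordNet corpus is available

DIM = 50
rng = np.random.default_rng(0)
synset_emb = {}  # hypothetical synset embedding table (learned jointly in the paper)

def embed_synset(synset):
    if synset.name() not in synset_emb:
        synset_emb[synset.name()] = rng.normal(size=DIM)
    return synset_emb[synset.name()]

def token_embedding(word, context_vec):
    """Context-sensitive embedding: attention over the word's synsets."""
    synsets = wn.synsets(word)
    if not synsets:
        return rng.normal(size=DIM)          # fall back to a word-level vector
    vecs = np.stack([embed_synset(s) for s in synsets])  # (n_synsets, DIM)
    scores = vecs @ context_vec              # relevance of each sense to the context
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax distribution over synsets
    return weights @ vecs                    # weighted mixture = token embedding

# Usage: the same word receives different embeddings in different contexts.
context_a = rng.normal(size=DIM)  # stand-in for an encoder state over one sentence
context_b = rng.normal(size=DIM)  # stand-in for an encoder state over another sentence
print(np.allclose(token_embedding("bank", context_a),
                  token_embedding("bank", context_b)))  # False
```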
Anthology ID:
P17-1191
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2089–2098
URL:
https://aclanthology.org/P17-1191
DOI:
10.18653/v1/P17-1191
Cite (ACL):
Pradeep Dasigi, Waleed Ammar, Chris Dyer, and Eduard Hovy. 2017. Ontology-Aware Token Embeddings for Prepositional Phrase Attachment. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2089–2098, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Ontology-Aware Token Embeddings for Prepositional Phrase Attachment (Dasigi et al., ACL 2017)
PDF:
https://aclanthology.org/P17-1191.pdf
Code
pdasigi/onto-lstm