Learning Contextual Embeddings for Structural Semantic Similarity using Categorical Information

Massimo Nicosia, Alessandro Moschitti

Abstract
Tree kernels (TKs) and neural networks are two effective approaches to automatic feature engineering. In this paper, we combine them by modeling contextual word similarity in semantic TKs. This way, the latter can perform subtree matching by applying neural-based similarity to tree lexical nodes. We study how to learn representations for words in context so that TKs can exploit more focused information. We found that neural embeddings produced by current methods do not provide a suitable contextual similarity. We therefore define a new approach based on a Siamese network, which produces word representations while learning a binary text similarity, where examples belonging to the same category are treated as similar. Experiments on question and sentiment classification show that our semantic TK substantially improves on previous results.
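The abstract describes a Siamese network that learns word representations as a by-product of a binary text-similarity task whose labels are derived from categories (same category = similar). As a rough illustration of that general setup, not the authors' actual architecture, here is a minimal PyTorch sketch; the layer choices, sizes, pooling, and contrastive loss below are all assumptions:

```python
# Hypothetical sketch of a Siamese encoder trained with a contrastive
# objective: sentence pairs from the same category are labeled similar (1),
# pairs from different categories dissimilar (0). Names and sizes are
# illustrative only, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseEncoder(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int = 100):
        super().__init__()
        # Shared embedding table: after training, these word vectors
        # would be the ones plugged into the semantic tree kernel.
        self.embedding = nn.Embedding(vocab_size, embed_dim)

    def encode(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Average word embeddings into a fixed-size sentence vector.
        return self.embedding(token_ids).mean(dim=1)

    def forward(self, left: torch.Tensor, right: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between the two (weight-shared) branches.
        return F.cosine_similarity(self.encode(left), self.encode(right))

def contrastive_loss(sim: torch.Tensor, label: torch.Tensor,
                     margin: float = 0.5) -> torch.Tensor:
    # Pull same-category pairs together; push different-category
    # pairs below the margin.
    pos = label * (1.0 - sim)
    neg = (1.0 - label) * F.relu(sim - margin)
    return (pos + neg).mean()

# Toy usage: a batch of 2 sentence pairs, 5 tokens each, vocab of 1000.
model = SiameseEncoder(vocab_size=1000)
left = torch.randint(0, 1000, (2, 5))
right = torch.randint(0, 1000, (2, 5))
labels = torch.tensor([1.0, 0.0])  # same category / different category
loss = contrastive_loss(model(left, right), labels)
loss.backward()
```

The key design point the abstract emphasizes is that the similarity task is only a training signal: the learned embedding table is extracted afterwards, so the tree kernel can score lexical-node matches with the resulting contextual similarity.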
Anthology ID:
K17-1027
Volume:
Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017)
Month:
August
Year:
2017
Address:
Vancouver, Canada
Editors:
Roger Levy, Lucia Specia
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
260–270
URL:
https://aclanthology.org/K17-1027
DOI:
10.18653/v1/K17-1027
Cite (ACL):
Massimo Nicosia and Alessandro Moschitti. 2017. Learning Contextual Embeddings for Structural Semantic Similarity using Categorical Information. In Proceedings of the 21st Conference on Computational Natural Language Learning (CoNLL 2017), pages 260–270, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Learning Contextual Embeddings for Structural Semantic Similarity using Categorical Information (Nicosia & Moschitti, CoNLL 2017)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/K17-1027.pdf