A Mixture-of-Experts Model for Learning Multi-Facet Entity Embeddings
Rana Alshaikh, Zied Bouraoui, Shelan Jeawak, Steven Schockaert
Abstract
Various methods have already been proposed for learning entity embeddings from text descriptions. Such embeddings are commonly used for inferring properties of entities, for recommendation and entity-oriented search, and for injecting background knowledge into neural architectures, among others. Entity embeddings essentially serve as a compact encoding of a similarity relation, but similarity is an inherently multi-faceted notion. By representing entities as single vectors, existing methods leave it to downstream applications to identify these different facets, and to select the most relevant ones. In this paper, we propose a model that instead learns several vectors for each entity, each of which intuitively captures a different aspect of the considered domain. We use a mixture-of-experts formulation to jointly learn these facet-specific embeddings. The individual entity embeddings are learned using a variant of the GloVe model, which has the advantage that we can easily identify which properties are modelled well in which of the learned embeddings. This is exploited by an associated gating network, which uses pre-trained word vectors to encourage the properties that are modelled by a given embedding to be semantically coherent, i.e. to encourage each of the individual embeddings to capture a meaningful facet.
- Anthology ID:
- 2020.coling-main.449
- Volume:
- Proceedings of the 28th International Conference on Computational Linguistics
- Month:
- December
- Year:
- 2020
- Address:
- Barcelona, Spain (Online)
- Editors:
- Donia Scott, Nuria Bel, Chengqing Zong
- Venue:
- COLING
- Publisher:
- International Committee on Computational Linguistics
- Pages:
- 5124–5135
- URL:
- https://aclanthology.org/2020.coling-main.449
- DOI:
- 10.18653/v1/2020.coling-main.449
- Cite (ACL):
- Rana Alshaikh, Zied Bouraoui, Shelan Jeawak, and Steven Schockaert. 2020. A Mixture-of-Experts Model for Learning Multi-Facet Entity Embeddings. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5124–5135, Barcelona, Spain (Online). International Committee on Computational Linguistics.
- Cite (Informal):
- A Mixture-of-Experts Model for Learning Multi-Facet Entity Embeddings (Alshaikh et al., COLING 2020)
- PDF:
- https://aclanthology.org/2020.coling-main.449.pdf
- Code:
- rana-alshaikh/moeglove
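
As a rough illustration of the mixture-of-experts formulation described in the abstract, the sketch below (not the authors' released code; see the linked repository for that) combines K facet-specific GloVe-style entity embeddings with a gating network over pre-trained word vectors, which softly assigns each context word to one facet. All class names, layer sizes, and other hyper-parameters here are illustrative assumptions.

```python
# Minimal sketch of a mixture-of-experts GloVe-style objective:
# K facet-specific entity embeddings, with a gating network over
# pre-trained word vectors deciding which facet is responsible for
# reconstructing each entity-word co-occurrence statistic.
import torch
import torch.nn as nn


class MoEGloVeSketch(nn.Module):
    def __init__(self, n_entities, n_words, dim, n_facets, word_vec_dim):
        super().__init__()
        # One GloVe-style entity embedding (plus bias) per facet.
        self.entity_emb = nn.Parameter(torch.randn(n_facets, n_entities, dim) * 0.01)
        self.entity_bias = nn.Parameter(torch.zeros(n_facets, n_entities))
        # Context-word vectors and biases, shared across facets here
        # (an assumption; they could equally be facet-specific).
        self.word_emb = nn.Parameter(torch.randn(n_words, dim) * 0.01)
        self.word_bias = nn.Parameter(torch.zeros(n_words))
        # Gating network: maps a pre-trained word vector to a soft
        # assignment over facets, so that the words modelled by each
        # facet tend to be semantically coherent.
        self.gate = nn.Sequential(
            nn.Linear(word_vec_dim, 64), nn.ReLU(), nn.Linear(64, n_facets)
        )

    def forward(self, ent_idx, word_idx, log_cooc, weight, word_vectors):
        # GloVe-style squared reconstruction error for every facet.
        e = self.entity_emb[:, ent_idx, :]                 # (K, B, d)
        w = self.word_emb[word_idx]                        # (B, d)
        pred = (e * w.unsqueeze(0)).sum(-1)                # (K, B)
        pred = pred + self.entity_bias[:, ent_idx] + self.word_bias[word_idx]
        per_facet_err = (pred - log_cooc.unsqueeze(0)) ** 2  # (K, B)
        # Gate probabilities computed from pre-trained word vectors.
        gate_p = torch.softmax(self.gate(word_vectors[word_idx]), dim=-1)  # (B, K)
        # Mixture-of-experts loss: each word's error is attributed to
        # facets in proportion to its gate probabilities, weighted by a
        # GloVe-style co-occurrence weight.
        return (weight * (gate_p.t() * per_facet_err).sum(0)).mean()
```

In this sketch, `weight` plays the role of GloVe's co-occurrence weighting function f(x_ij), and `word_vectors` holds fixed pre-trained vectors for the context words. Because the gate is a function of those pre-trained vectors rather than of the entities, words assigned to the same expert are pushed to be semantically related, which is the mechanism the abstract describes for making each facet-specific embedding capture a coherent facet.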