Abstract
Building meaningful representations of noun compounds is not trivial since many of them scarcely appear in the corpus. To that end, composition functions approximate the distributional representation of a noun compound by combining its constituent distributional vectors. In the more general case, phrase embeddings have been trained by minimizing the distance between the vectors representing paraphrases. We compare various types of noun compound representations, including distributional, compositional, and paraphrase-based representations, through a series of tasks and analyses, and with an extensive number of underlying word embeddings. We find that indeed, in most cases, composition functions produce higher quality representations than distributional ones, and they improve with computational power. No single function performs best in all scenarios, suggesting that a joint training objective may produce improved representations.
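To make the notion of a composition function concrete, the sketch below shows two common variants from the compositional distributional semantics literature: simple vector addition and a "full-additive" model that applies learned matrices to each constituent. It is a minimal illustration only, not code from the paper or the linked repository; the vectors, function names, and matrices are hypothetical placeholders.

```python
import numpy as np

# Toy pretrained word vectors (hypothetical values for illustration only).
EMBEDDINGS = {
    "olive": np.array([0.2, 0.7, 0.1]),
    "oil":   np.array([0.5, 0.1, 0.9]),
}

def compose_add(w1, w2, embeddings=EMBEDDINGS):
    """Additive composition: approximate the compound vector as v(w1) + v(w2)."""
    return embeddings[w1] + embeddings[w2]

def compose_full_add(w1, w2, A, B, embeddings=EMBEDDINGS):
    """Full-additive composition: A @ v(w1) + B @ v(w2), where A and B are
    matrices that would normally be learned (here random stand-ins)."""
    return A @ embeddings[w1] + B @ embeddings[w2]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 3
    A, B = rng.standard_normal((d, d)), rng.standard_normal((d, d))
    print("add:      ", compose_add("olive", "oil"))
    print("full-add: ", compose_full_add("olive", "oil", A, B))
```

In the paraphrase-based setting mentioned in the abstract, such a composed vector would instead be trained so that its distance to the vector of a paraphrase (e.g., a phrase describing the compound) is minimized.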
- Anthology ID: W19-5111
- Volume: Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019)
- Month: August
- Year: 2019
- Address: Florence, Italy
- Venue: MWE
- SIG: SIGLEX
- Publisher: Association for Computational Linguistics
- Pages: 92–103
- URL: https://aclanthology.org/W19-5111
- DOI: 10.18653/v1/W19-5111
- Cite (ACL): Vered Shwartz. 2019. A Systematic Comparison of English Noun Compound Representations. In Proceedings of the Joint Workshop on Multiword Expressions and WordNet (MWE-WN 2019), pages 92–103, Florence, Italy. Association for Computational Linguistics.
- Cite (Informal): A Systematic Comparison of English Noun Compound Representations (Shwartz, MWE 2019)
- PDF: https://preview.aclanthology.org/ingestion-script-update/W19-5111.pdf
- Code: vered1986/NC_Embeddings