Abstract
The problem of blend formation in generative linguistics is interesting in the context of neologisms, their rapid adoption in modern life, and the creative generative process guiding their formation. Blend quality depends on a multitude of factors with high degrees of uncertainty. In this work, we investigate whether modern neural network models can sufficiently capture and recognize the creative blend composition process. We propose recurrent neural network sequence-to-sequence models that are evaluated on multiple blend datasets available in the literature. We also propose an ensemble neural and hybrid model that outperforms most of the baseline and heuristic models upon evaluation on test data.
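The abstract frames blend formation (e.g. "breakfast" + "lunch" → "brunch") as a sequence-to-sequence task. The sketch below is only a minimal illustration of that general framing, assuming a character-level encoder-decoder in PyTorch; it is not the paper's Neuramanteau ensemble, and the class name, vocabulary size, and hyperparameters are all illustrative assumptions.

```python
# Illustrative character-level seq2seq sketch for blend generation.
# Not the Neuramanteau architecture; names and sizes are assumptions.
import torch
import torch.nn as nn

PAD, SOS, EOS = 0, 1, 2
VOCAB = 30  # assumed character vocabulary (a-z plus special symbols)

class BlendSeq2Seq(nn.Module):
    """Encode the two source words as one character sequence, decode the blend."""
    def __init__(self, vocab=VOCAB, emb=64, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb, padding_idx=PAD)
        self.encoder = nn.GRU(emb, hid, batch_first=True, bidirectional=True)
        self.decoder = nn.GRU(emb, 2 * hid, batch_first=True)
        self.out = nn.Linear(2 * hid, vocab)

    def forward(self, src, tgt):
        # src: (B, S) character ids of "word1 <sep> word2"; tgt: (B, T) blend ids
        _, h = self.encoder(self.embed(src))                # h: (2, B, hid)
        h0 = torch.cat([h[0], h[1]], dim=-1).unsqueeze(0)   # (1, B, 2*hid)
        dec_out, _ = self.decoder(self.embed(tgt), h0)      # teacher forcing
        return self.out(dec_out)                            # (B, T, vocab) logits

# Dummy usage: train with cross-entropy against the gold blend characters.
model = BlendSeq2Seq()
src = torch.randint(3, VOCAB, (8, 20))   # dummy batch of source character ids
tgt = torch.randint(3, VOCAB, (8, 10))   # dummy gold blend character ids
logits = model(src, tgt)
loss = nn.functional.cross_entropy(
    logits.reshape(-1, VOCAB), tgt.reshape(-1), ignore_index=PAD)
```

An ensemble in this setting could, for instance, average the output distributions of several such models or combine them with heuristic splitting rules; the paper should be consulted for the actual ensemble and hybrid design.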
- Anthology ID:
- I17-1058
- Volume:
- Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
- Month:
- November
- Year:
- 2017
- Address:
- Taipei, Taiwan
- Editors:
- Greg Kondrak, Taro Watanabe
- Venue:
- IJCNLP
- Publisher:
- Asian Federation of Natural Language Processing
- Pages:
- 576–583
- URL:
- https://aclanthology.org/I17-1058
- Cite (ACL):
- Kollol Das and Shaona Ghosh. 2017. Neuramanteau: A Neural Network Ensemble Model for Lexical Blends. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 576–583, Taipei, Taiwan. Asian Federation of Natural Language Processing.
- Cite (Informal):
- Neuramanteau: A Neural Network Ensemble Model for Lexical Blends (Das & Ghosh, IJCNLP 2017)
- PDF:
- https://preview.aclanthology.org/ml4al-ingestion/I17-1058.pdf