Abstract
State-of-the-art networks that model relations between two pieces of text often use complex architectures and attention. In this paper, instead of focusing on architecture engineering, we take advantage of small amounts of labelled data that model semantic phenomena in text to encode matching features directly in the word representations. This greatly boosts the accuracy of our reference network, while keeping the model simple and fast to train. Our approach also beats a tree kernel model that uses similar input encodings, and neural models which use advanced attention and compare-aggregate mechanisms.

- Anthology ID: D18-1133
- Volume: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
- Month: October-November
- Year: 2018
- Address: Brussels, Belgium
- Venue: EMNLP
- SIG: SIGDAT
- Publisher: Association for Computational Linguistics
- Pages: 1070–1076
- URL: https://aclanthology.org/D18-1133
- DOI: 10.18653/v1/D18-1133
- Cite (ACL): Massimo Nicosia and Alessandro Moschitti. 2018. Semantic Linking in Convolutional Neural Networks for Answer Sentence Selection. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 1070–1076, Brussels, Belgium. Association for Computational Linguistics.
- Cite (Informal): Semantic Linking in Convolutional Neural Networks for Answer Sentence Selection (Nicosia & Moschitti, EMNLP 2018)
- PDF: https://preview.aclanthology.org/ingestion-script-update/D18-1133.pdf
- Data: TrecQA, WikiQA
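The abstract's central idea is to encode question-answer matching features directly in the word representations rather than in the architecture. As a minimal sketch of this style of feature injection, the snippet below appends a binary "appears in the question" overlap flag to each answer-word embedding before it would be fed to a CNN. Note this overlap flag is an illustrative assumption: the paper's actual features come from labelled data modelling semantic phenomena, and the function and variable names here are hypothetical.

```python
# Sketch (assumption, not the paper's exact recipe): concatenate a
# question-overlap indicator to each answer-token embedding, so the
# matching signal lives in the word representations themselves.
import numpy as np

def add_overlap_features(answer_tokens, question_tokens, embeddings):
    """Append a 1-bit question-overlap flag to each answer embedding.

    embeddings: dict mapping token -> np.ndarray of shape (d,)
    Returns an array of shape (len(answer_tokens), d + 1).
    """
    question_set = {t.lower() for t in question_tokens}
    rows = []
    for tok in answer_tokens:
        vec = embeddings[tok]
        flag = 1.0 if tok.lower() in question_set else 0.0
        rows.append(np.concatenate([vec, [flag]]))
    return np.stack(rows)

# Toy usage with random 4-dimensional embeddings.
rng = np.random.default_rng(0)
question = ["who", "wrote", "hamlet"]
answer = ["shakespeare", "wrote", "hamlet", "in", "1600"]
emb = {t: rng.standard_normal(4) for t in set(question + answer)}
feats = add_overlap_features(answer, question, emb)
print(feats.shape)  # (5, 5): 4 embedding dims + 1 overlap flag
```

Because the extra feature is just another input dimension, the downstream network stays unchanged, which matches the abstract's claim that the model remains simple and fast to train.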