Character-Based Neural Networks for Sentence Pair Modeling

Wuwei Lan, Wei Xu


Abstract
Sentence pair modeling is critical for many NLP tasks, such as paraphrase identification, semantic textual similarity, and natural language inference. Most state-of-the-art neural models for these tasks rely on pretrained word embeddings and compose sentence-level semantics in varied ways; however, few works have attempted to verify whether we really need pretrained embeddings in these tasks. In this paper, we study how effective subword-level (character and character n-gram) representations are in sentence pair modeling. Though it is well known that subword models are effective in tasks with single-sentence input, including language modeling and machine translation, they have not been systematically studied in sentence pair modeling tasks, where both the semantic and string similarities between texts matter. Our experiments show that subword models without any pretrained word embeddings can achieve new state-of-the-art results on two social media datasets and competitive results on news data for paraphrase identification.
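To make the idea of subword-level representations concrete, the following sketch shows one common way to build word vectors from character n-grams without any pretraining: extract boundary-marked n-grams from each word and average their (randomly initialized) embeddings. This is an illustrative assumption, not the paper's exact architecture; the function names, n-gram range, and embedding dimension here are hypothetical.

```python
from collections import defaultdict
import numpy as np

def char_ngrams(word, n_min=3, n_max=4):
    """Extract character n-grams from a word, with < > boundary markers."""
    w = f"<{word}>"
    grams = []
    for n in range(n_min, n_max + 1):
        grams.extend(w[i:i + n] for i in range(len(w) - n + 1))
    return grams

# Hypothetical subword embedding table, randomly initialized
# (i.e., no pretrained word embeddings are used).
rng = np.random.default_rng(0)
DIM = 8
emb = defaultdict(lambda: rng.standard_normal(DIM))

def word_vector(word):
    """Compose a word vector by averaging its character n-gram embeddings."""
    return np.mean([emb[g] for g in char_ngrams(word)], axis=0)
```

Word vectors composed this way can then be fed into any sentence pair encoder; because the n-gram inventory is shared across words, string-similar words (e.g., misspellings common in social media) naturally receive similar vectors.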
Anthology ID:
N18-2025
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
157–163
URL:
https://aclanthology.org/N18-2025
DOI:
10.18653/v1/N18-2025
Cite (ACL):
Wuwei Lan and Wei Xu. 2018. Character-Based Neural Networks for Sentence Pair Modeling. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 157–163, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Character-Based Neural Networks for Sentence Pair Modeling (Lan & Xu, NAACL 2018)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/N18-2025.pdf