Attending to Characters in Neural Sequence Labeling Models

Marek Rei, Gamal Crichton, Sampo Pyysalo

Abstract
Sequence labeling architectures use word embeddings for capturing similarity, but suffer when handling previously unseen or rare words. We investigate character-level extensions to such models and propose a novel architecture for combining alternative word representations. By using an attention mechanism, the model is able to dynamically decide how much information to use from a word- or character-level component. We evaluated different architectures on a range of sequence labeling datasets, and character-level extensions were found to improve performance on every benchmark. In addition, the proposed attention-based architecture delivered the best results even with a smaller number of trainable parameters.
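The combination described in the abstract can be read as a learned per-dimension gate between the word embedding w and a character-level composition m: a gate vector z is predicted from both representations, and the combined embedding is z * w + (1 - z) * m. Below is a minimal PyTorch sketch of that gating idea; the module and parameter names (CharWordGate, dim, etc.) are illustrative choices, not identifiers from the authors' code, and the character-level vector m is assumed to have already been projected to the word embedding dimensionality (e.g., by a character BiLSTM).

```python
import torch
import torch.nn as nn

class CharWordGate(nn.Module):
    """Combine a word-level embedding w with a character-level
    composition m via a learned per-dimension gate z:
        x = z * w + (1 - z) * m
    This is a sketch of the attention/gating idea from the abstract,
    not the authors' exact implementation."""

    def __init__(self, dim: int):
        super().__init__()
        # Three linear projections feeding the gate; illustrative names.
        self.w1 = nn.Linear(dim, dim, bias=False)
        self.w2 = nn.Linear(dim, dim, bias=False)
        self.w3 = nn.Linear(dim, dim, bias=False)

    def forward(self, w: torch.Tensor, m: torch.Tensor) -> torch.Tensor:
        # z in (0, 1) decides, per dimension, how much information to
        # take from the word-level vs. the character-level component.
        z = torch.sigmoid(self.w3(torch.tanh(self.w1(w) + self.w2(m))))
        return z * w + (1 - z) * m

# Usage: gate a batch of 300-dimensional word and character vectors.
gate = CharWordGate(dim=300)
w = torch.randn(8, 300)   # word embeddings (batch of 8 tokens)
m = torch.randn(8, 300)   # char-level vectors, same dimensionality
x = gate(w, m)            # combined representations, shape (8, 300)
```

Because the gate is element-wise, the model can fall back to character-level information for rare or unseen words (where the word embedding is uninformative) while relying on the word embedding for frequent words, which is the dynamic behavior the abstract describes.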
Anthology ID:
C16-1030
Volume:
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Month:
December
Year:
2016
Address:
Osaka, Japan
Editors:
Yuji Matsumoto, Rashmi Prasad
Venue:
COLING
Publisher:
The COLING 2016 Organizing Committee
Note:
Pages:
309–318
URL:
https://aclanthology.org/C16-1030
Cite (ACL):
Marek Rei, Gamal Crichton, and Sampo Pyysalo. 2016. Attending to Characters in Neural Sequence Labeling Models. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 309–318, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal):
Attending to Characters in Neural Sequence Labeling Models (Rei et al., COLING 2016)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/C16-1030.pdf
Data:
FCE, Penn Treebank