Multilingual Named Entity Recognition Using Pretrained Embeddings, Attention Mechanism and NCRF

Anton Emelyanov, Ekaterina Artemova

Abstract
In this paper we tackle the task of multilingual named entity recognition. We use the BERT language model as an embedder, with a bidirectional recurrent network, an attention mechanism, and an NCRF layer on top. We apply multilingual BERT only as an embedder, without any fine-tuning. We test our model on the dataset of the BSNLP shared task, which consists of texts in Bulgarian, Czech, Polish, and Russian.
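
The pipeline the abstract describes (frozen multilingual BERT embeddings feeding a bidirectional RNN, a self-attention layer, and a CRF decoder) can be outlined as follows. This is a minimal sketch, not the authors' code: it assumes PyTorch with the HuggingFace `transformers` and `pytorch-crf` packages, substitutes a plain linear-chain CRF for the NCRF layer the paper names, and the class name `BertBiRNNAttnCRF`, hidden size, and head count are illustrative choices rather than the paper's settings.

```python
# Minimal sketch of the abstract's architecture, assuming PyTorch,
# HuggingFace `transformers`, and `pytorch-crf` (pip install pytorch-crf).
# The plain linear-chain CRF below stands in for the paper's NCRF layer.
import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF


class BertBiRNNAttnCRF(nn.Module):
    """Frozen multilingual BERT -> BiLSTM -> self-attention -> CRF."""

    def __init__(self, num_tags: int, hidden: int = 256, heads: int = 4):
        super().__init__()
        # Multilingual BERT is used purely as an embedder, no fine-tuning.
        self.bert = BertModel.from_pretrained("bert-base-multilingual-cased")
        for p in self.bert.parameters():
            p.requires_grad = False
        self.rnn = nn.LSTM(self.bert.config.hidden_size, hidden,
                           batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.proj = nn.Linear(2 * hidden, num_tags)
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        with torch.no_grad():  # BERT stays frozen: embeddings only
            emb = self.bert(input_ids,
                            attention_mask=attention_mask).last_hidden_state
        rnn_out, _ = self.rnn(emb)
        # Self-attention over BiLSTM states; padding positions are ignored.
        attn_out, _ = self.attn(rnn_out, rnn_out, rnn_out,
                                key_padding_mask=~attention_mask.bool())
        emissions = self.proj(attn_out)
        mask = attention_mask.bool()
        if tags is not None:  # training: negative CRF log-likelihood
            return -self.crf(emissions, tags, mask=mask, reduction="mean")
        return self.crf.decode(emissions, mask=mask)  # best tag sequences
```

In this setup only the BiLSTM, attention, projection, and CRF parameters receive gradients; BERT is queried under torch.no_grad(), mirroring the paper's use of BERT strictly as an embedder. Inputs would come from the matching BERT tokenizer, with entity tags aligned to its subword tokenization.
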
Anthology ID:
W19-3713
Volume:
Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Tomaž Erjavec, Michał Marcińczuk, Preslav Nakov, Jakub Piskorski, Lidia Pivovarova, Jan Šnajder, Josef Steinberger, Roman Yangarber
Venue:
BSNLP
SIG:
SIGSLAV
Publisher:
Association for Computational Linguistics
Pages:
94–99
URL:
https://aclanthology.org/W19-3713
DOI:
10.18653/v1/W19-3713
Cite (ACL):
Anton Emelyanov and Ekaterina Artemova. 2019. Multilingual Named Entity Recognition Using Pretrained Embeddings, Attention Mechanism and NCRF. In Proceedings of the 7th Workshop on Balto-Slavic Natural Language Processing, pages 94–99, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Multilingual Named Entity Recognition Using Pretrained Embeddings, Attention Mechanism and NCRF (Emelyanov & Artemova, BSNLP 2019)
PDF:
https://aclanthology.org/W19-3713.pdf