Tackling Code-Switched NER: Participation of CMU

Parvathy Geetha, Khyathi Chandu, Alan W Black


Abstract
Named Entity Recognition (NER) plays a major role in several downstream NLP applications. Though the task has been studied extensively in formal monolingual text and in noisy text such as Twitter data, it is still an emerging task for code-switched (CS) content on social media. This paper describes our participation in the shared task on NER for code-switched data in Spanglish (Spanish + English) and Arabish (Arabic + English), and presents models developed from intuitions about the shared task data. Owing to the sparse and non-linear relationships between words in Twitter data, we explored neural architectures capable of capturing such non-linearities. Specifically, we trained character-level and word-level models based on Bidirectional LSTMs (Bi-LSTMs) to perform sequence tagging. We train multiple models to identify nominal mentions and subsequently use this information to predict named entity labels in a sequence. Our best models were a character-level model combined with pre-trained multilingual word embeddings, which achieved an F-score of 56.72 on Spanglish, and a word-level model, which achieved an F-score of 65.02 on Arabish, both on the test data.
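The abstract describes combining a character-level Bi-LSTM with word-level embeddings to feed a word-level Bi-LSTM tagger. A minimal PyTorch sketch of that general architecture is below; all dimensions, vocabulary sizes, and the class name are illustrative assumptions, not details taken from the paper:

```python
import torch
import torch.nn as nn

class CharWordBiLSTMTagger(nn.Module):
    """Sketch of a sequence tagger that concatenates character-level
    Bi-LSTM word representations with word embeddings, then runs a
    word-level Bi-LSTM over the sentence. Hyperparameters are
    hypothetical, chosen only for illustration."""

    def __init__(self, char_vocab, word_vocab, num_tags,
                 char_dim=25, word_dim=50, hidden=64):
        super().__init__()
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # Character Bi-LSTM; its final hidden states summarize each word.
        self.char_lstm = nn.LSTM(char_dim, char_dim,
                                 bidirectional=True, batch_first=True)
        self.word_emb = nn.Embedding(word_vocab, word_dim)
        # Word Bi-LSTM over [word embedding ; char-level representation].
        self.word_lstm = nn.LSTM(word_dim + 2 * char_dim, hidden,
                                 bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, num_tags)  # per-token tag scores

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, word_len)
        b, s, w = char_ids.shape
        chars = self.char_emb(char_ids.view(b * s, w))
        _, (h, _) = self.char_lstm(chars)           # h: (2, b*s, char_dim)
        char_repr = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        words = self.word_emb(word_ids)             # (b, s, word_dim)
        feats, _ = self.word_lstm(torch.cat([words, char_repr], dim=-1))
        return self.out(feats)                      # (b, s, num_tags)

model = CharWordBiLSTMTagger(char_vocab=100, word_vocab=1000, num_tags=9)
logits = model(torch.randint(0, 1000, (2, 7)),
               torch.randint(0, 100, (2, 7, 10)))
print(logits.shape)  # one tag-score vector per token
```

In practice the word embedding table would be initialized from pre-trained multilingual embeddings (as the abstract mentions for the best Spanglish model) and the tag inventory would follow the shared task's label scheme.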
Anthology ID:
W18-3217
Volume:
Proceedings of the Third Workshop on Computational Approaches to Linguistic Code-Switching
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Gustavo Aguilar, Fahad AlGhamdi, Victor Soto, Thamar Solorio, Mona Diab, Julia Hirschberg
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
126–131
URL:
https://aclanthology.org/W18-3217
DOI:
10.18653/v1/W18-3217
Cite (ACL):
Parvathy Geetha, Khyathi Chandu, and Alan W Black. 2018. Tackling Code-Switched NER: Participation of CMU. In Proceedings of the Third Workshop on Computational Approaches to Linguistic Code-Switching, pages 126–131, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Tackling Code-Switched NER: Participation of CMU (Geetha et al., ACL 2018)
PDF:
https://preview.aclanthology.org/nschneid-patch-3/W18-3217.pdf