Performance Analysis of Arabic Pre-trained Models on Named Entity Recognition Task

Abdelhalim Hafedh Dahou, Mohamed Amine Cheragui, Ahmed Abdelali


Abstract
Named Entity Recognition (NER) is a crucial task within natural language processing (NLP) that entails the identification and classification of entities such as person, organization, and location. This study delves into NER specifically in the Arabic language, focusing on the Algerian dialect. While previous research on NER has primarily concentrated on Modern Standard Arabic (MSA), the advent of social media has prompted a need to address the variations found in different Arabic dialects. Moreover, given the notable achievements of large-scale pre-trained models (PTMs) based on the BERT architecture, this paper aims to evaluate Arabic pre-trained models using an Algerian dataset that covers different domains and writing styles. Additionally, an error analysis is conducted to identify the PTMs' limitations, and an investigation is carried out to assess the performance of MSA-trained models on the Algerian dialect. The experimental results and subsequent analysis shed light on the complexities of NER in Arabic, offering valuable insights for future research endeavors.
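To make the evaluated task concrete, below is a minimal sketch of BERT-based Arabic NER inference using the Hugging Face transformers token-classification pipeline. The checkpoint name is an assumption (one publicly available Arabic NER model), not necessarily one of the PTMs evaluated in the paper, and the example sentence is illustrative.

```python
from transformers import pipeline

# Minimal sketch: tag named entities in Arabic text with a BERT-based PTM.
# The checkpoint below is an assumed, publicly available Arabic NER model;
# substitute any BERT-style Arabic checkpoint fine-tuned for NER.
ner = pipeline(
    "token-classification",
    model="CAMeL-Lab/bert-base-arabic-camelbert-mix-ner",
    aggregation_strategy="simple",  # merge word-piece tokens into entity spans
)

# Example: "Mohamed lives in Algiers" (person + location entities expected).
for entity in ner("يعيش محمد في الجزائر العاصمة"):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```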
Anthology ID: 2023.ranlp-1.51
Volume: Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing
Month: September
Year: 2023
Address: Varna, Bulgaria
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd., Shoumen, Bulgaria
Pages: 458–467
URL: https://aclanthology.org/2023.ranlp-1.51
Cite (ACL): Abdelhalim Hafedh Dahou, Mohamed Amine Cheragui, and Ahmed Abdelali. 2023. Performance Analysis of Arabic Pre-trained Models on Named Entity Recognition Task. In Proceedings of the 14th International Conference on Recent Advances in Natural Language Processing, pages 458–467, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal): Performance Analysis of Arabic Pre-trained Models on Named Entity Recognition Task (Dahou et al., RANLP 2023)
PDF: https://preview.aclanthology.org/emnlp-22-attachments/2023.ranlp-1.51.pdf