@inproceedings{alshammari-2024-bangor,
    title = "{B}angor {U}niversity at {W}ojood{NER} 2024: Advancing {A}rabic Named Entity Recognition with {CAM}e{LBERT}-Mix",
    author = "Alshammari, Norah",
    editor = "Habash, Nizar  and
      Bouamor, Houda  and
      Eskander, Ramy  and
      Tomeh, Nadi  and
      Abu Farha, Ibrahim  and
      Abdelali, Ahmed  and
      Touileb, Samia  and
      Hamed, Injy  and
      Onaizan, Yaser  and
      Alhafni, Bashar  and
      Antoun, Wissam  and
      Khalifa, Salam  and
      Haddad, Hatem  and
      Zitouni, Imed  and
      AlKhamissi, Badr  and
      Almatham, Rawan  and
      Mrini, Khalil",
    booktitle = "Proceedings of the Second Arabic Natural Language Processing Conference",
    month = aug,
    year = "2024",
    address = "Bangkok, Thailand",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2024.arabicnlp-1.105/",
    doi = "10.18653/v1/2024.arabicnlp-1.105",
    pages = "880--884",
    abstract = "This paper describes the approach and results of Bangor University{'}s participation in the WojoodNER 2024 shared task, specifically for Subtask-1: Closed-Track Flat Fine-Grain NER. We present a system utilizing a transformer-based model called bert-base-arabic-camelbert-mix, fine-tuned on the Wojood-Fine corpus. A key enhancement to our approach involves adding a linear layer on top of the bert-base-arabic-camelbert-mix to classify each token into one of 51 different entity types and subtypes, as well as the `O' label for non-entity tokens. This linear layer effectively maps the contextualized embeddings produced by BERT to the desired output labels, addressing the complex challenges of fine-grained Arabic NER. The system achieved competitive results in precision, recall, and F1 scores, thereby contributing significant insights into the application of transformers in Arabic NER tasks."
}