Neural Coreference Resolution for Arabic

Abdulrahman Aloraini, Juntao Yu, Massimo Poesio


Abstract
No neural coreference resolver for Arabic exists; in fact, we are not aware of any learning-based coreference resolver for Arabic since Björkelund and Kuhn (2014). In this paper, we introduce a coreference resolution system for Arabic based on Lee et al.'s end-to-end architecture, combined with the Arabic version of BERT and an external mention detector. As far as we know, this is the first neural coreference resolution system aimed specifically at Arabic, and it substantially outperforms the existing state of the art on OntoNotes 5.0, with a gain of 15.2 CoNLL F1 points. We also discuss the current limitations of the task for Arabic and possible approaches to tackling these challenges.
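The system described in the abstract follows Lee et al.'s end-to-end design: an Arabic BERT encoder produces contextual embeddings from which candidate spans are represented and scored for mention detection and antecedent linking. The sketch below is only an illustration of how such span representations could be computed with the Hugging Face transformers library; the checkpoint name (asafaya/bert-base-arabic), the boundary-plus-mean span encoding, and the dot-product pair score are assumptions made for illustration, not the authors' exact configuration, which is available in the linked juntaoy/aracoref repository.

# Minimal sketch of BERT-based span representations in the spirit of
# Lee et al.'s end-to-end coreference architecture. The checkpoint name
# and the span/pair scoring below are illustrative assumptions only.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "asafaya/bert-base-arabic"  # assumption: any Arabic BERT checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

# "The president said he will visit the city tomorrow"
sentence = "قال الرئيس إنه سيزور المدينة غدا"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    # Contextual embeddings for every subword token: (seq_len, hidden_size)
    hidden = encoder(**inputs).last_hidden_state.squeeze(0)

def span_representation(start: int, end: int) -> torch.Tensor:
    """Concatenate the boundary token embeddings with a mean over the span,
    a common choice for mention representations in end-to-end coreference."""
    return torch.cat([hidden[start], hidden[end], hidden[start:end + 1].mean(dim=0)])

# Example: a placeholder antecedent score for two candidate spans.
span_a = span_representation(1, 2)
span_b = span_representation(4, 6)
pair_score = torch.dot(span_a, span_b)
print(pair_score.item())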
Anthology ID: 2020.crac-1.11
Volume: Proceedings of the Third Workshop on Computational Models of Reference, Anaphora and Coreference
Month: December
Year: 2020
Address: Barcelona, Spain (online)
Editors: Maciej Ogrodniczuk, Vincent Ng, Yulia Grishina, Sameer Pradhan
Venue: CRAC
Publisher: Association for Computational Linguistics
Pages: 99–110
URL: https://aclanthology.org/2020.crac-1.11
Cite (ACL): Abdulrahman Aloraini, Juntao Yu, and Massimo Poesio. 2020. Neural Coreference Resolution for Arabic. In Proceedings of the Third Workshop on Computational Models of Reference, Anaphora and Coreference, pages 99–110, Barcelona, Spain (online). Association for Computational Linguistics.
Cite (Informal): Neural Coreference Resolution for Arabic (Aloraini et al., CRAC 2020)
PDF: https://preview.aclanthology.org/nschneid-patch-4/2020.crac-1.11.pdf
Code: juntaoy/aracoref
Data: CoNLL-2012