D-Coref: A Fast and Lightweight Coreference Resolution Model using DistilBERT

Chanchal Suman, Jeetu Kumar, Sriparna Saha, Pushpak Bhattacharyya


Abstract
Smart devices are often deployed as edge devices, which require quality solutions within a limited memory budget. In most user-interaction-based smart devices, coreference resolution is frequently required. Keeping this in view, we have developed a fast and lightweight coreference resolution model that meets a minimal memory requirement and converges faster. To generate the embeddings for solving the coreference resolution task, DistilBERT, a lightweight BERT variant, is utilized. DistilBERT consumes less memory (only 60% of the memory of the heavier BERT-based model) and is therefore suitable for deployment on edge devices. The DistilBERT embeddings enable 60% faster convergence, at an accuracy cost of 2.59% and 6.49% with respect to the base model and the current state of the art, respectively.
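To make the approach concrete, below is a minimal sketch (not the authors' released code) of how DistilBERT contextual embeddings can be obtained for candidate mention spans with the Hugging Face transformers library. The model name, example sentence, token indices, and mean-pooled span representation are illustrative assumptions; the paper's exact span scorer may differ.

import torch
from transformers import DistilBertModel, DistilBertTokenizerFast

# Load the pretrained DistilBERT encoder and its tokenizer.
tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
encoder = DistilBertModel.from_pretrained("distilbert-base-uncased")

text = "Alice dropped her phone because it was hot."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # last_hidden_state: (batch, seq_len, 768) contextual token embeddings.
    hidden = encoder(**inputs).last_hidden_state

def span_embedding(start: int, end: int) -> torch.Tensor:
    """Represent a candidate mention by mean-pooling its token vectors
    (one common choice; purely illustrative here)."""
    return hidden[0, start:end].mean(dim=0)

# Hypothetical mention spans as token indices ([CLS] occupies index 0):
# "Alice" at index 1, "her" at index 3.
antecedent = span_embedding(1, 2)
anaphor = span_embedding(3, 4)

# A toy pairwise compatibility score; real coreference models typically
# feed span representations through a learned scoring network instead.
score = torch.cosine_similarity(anaphor, antecedent, dim=0)
print(f"pairwise coreference score (cosine): {score.item():.3f}")

Because DistilBERT has roughly half the layers of BERT while keeping the same 768-dimensional hidden size, such span representations drop into a BERT-based coreference pipeline with a smaller memory and latency footprint, which is the trade-off the abstract quantifies.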
Anthology ID: 2020.icon-main.43
Volume: Proceedings of the 17th International Conference on Natural Language Processing (ICON)
Month: December
Year: 2020
Address: Indian Institute of Technology Patna, Patna, India
Editors: Pushpak Bhattacharyya, Dipti Misra Sharma, Rajeev Sangal
Venue: ICON
Publisher: NLP Association of India (NLPAI)
Pages: 323–328
URL: https://aclanthology.org/2020.icon-main.43
Cite (ACL): Chanchal Suman, Jeetu Kumar, Sriparna Saha, and Pushpak Bhattacharyya. 2020. D-Coref: A Fast and Lightweight Coreference Resolution Model using DistilBERT. In Proceedings of the 17th International Conference on Natural Language Processing (ICON), pages 323–328, Indian Institute of Technology Patna, Patna, India. NLP Association of India (NLPAI).
Cite (Informal): D-Coref: A Fast and Lightweight Coreference Resolution Model using DistilBERT (Suman et al., ICON 2020)
PDF: https://preview.aclanthology.org/emnlp22-frontmatter/2020.icon-main.43.pdf