Improving and Simplifying Template-Based Named Entity Recognition

Murali Kondragunta, Olatz Perez-de-Viñaspre, Maite Oronoz


Abstract
With the rise of large language models, researchers have begun exploiting them by recasting downstream tasks as language modeling tasks using prompts. In this work, we convert the Named Entity Recognition task into a seq2seq task by generating synthetic sentences from templates. Our main contribution is a conversion framework that provides faster inference. In addition, we test our method's performance in resource-rich, low-resource, and domain-transfer settings. Results show that our method achieves comparable results in the resource-rich setting and outperforms the current state-of-the-art seq2seq approach in few-shot settings. Through the experiments, we observed that negative examples play an important role in the model's performance. We applied our approach to the BART and T5-base models and found that the T5 architecture aligns better with our task. All experiments are performed on English-language datasets.
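To make the abstract's framing concrete, the sketch below illustrates (in our own words, not the authors' code) how template-based NER recasts span classification as text generation: each candidate span is paired with filled-in template sentences, and a seq2seq model such as BART or T5 would then score or generate the template matching the true entity type. The entity types, template wording, and span enumeration here are illustrative assumptions.

```python
# Illustrative sketch of template-based NER as seq2seq (not the paper's exact code).
# Candidate spans are turned into natural-language template sentences; a seq2seq
# model would score each filled template, and negative templates (the abstract
# notes these are important) cover non-entity spans.

ENTITY_TYPES = ["person", "location", "organization"]  # assumed label set

def candidate_spans(tokens, max_len=3):
    """Enumerate contiguous token spans up to max_len tokens."""
    spans = []
    for start in range(len(tokens)):
        for end in range(start + 1, min(start + max_len, len(tokens)) + 1):
            spans.append(" ".join(tokens[start:end]))
    return spans

def fill_templates(span):
    """One positive template per entity type, plus a negative template."""
    positives = [f"{span} is a {t} entity" for t in ENTITY_TYPES]
    negative = f"{span} is not a named entity"
    return positives + [negative]

tokens = "Barack Obama visited Paris".split()
for span in candidate_spans(tokens, max_len=2):
    for template in fill_templates(span):
        pass  # here a seq2seq model (e.g. BART or T5-base) would score the template
```

Scoring every (span, template) pair is what makes naive template-based NER slow at inference time; the paper's contribution is a conversion framework that avoids that cost.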
Anthology ID:
2023.eacl-srw.8
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Elisa Bassignana, Matthias Lindemann, Alban Petit
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
79–86
URL:
https://aclanthology.org/2023.eacl-srw.8
DOI:
10.18653/v1/2023.eacl-srw.8
Cite (ACL):
Murali Kondragunta, Olatz Perez-de-Viñaspre, and Maite Oronoz. 2023. Improving and Simplifying Template-Based Named Entity Recognition. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 79–86, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Improving and Simplifying Template-Based Named Entity Recognition (Kondragunta et al., EACL 2023)
PDF:
https://preview.aclanthology.org/naacl24-info/2023.eacl-srw.8.pdf
Video:
https://preview.aclanthology.org/naacl24-info/2023.eacl-srw.8.mp4