An Empirical Study on Multiple Information Sources for Zero-Shot Fine-Grained Entity Typing

Yi Chen, Haiyun Jiang, Lemao Liu, Shuming Shi, Chuang Fan, Min Yang, Ruifeng Xu


Abstract
Auxiliary information from multiple sources has been demonstrated to be effective in zero-shot fine-grained entity typing (ZFET). However, a comprehensive understanding of how to make better use of the existing information sources, and of how they affect the performance of ZFET, is still lacking. In this paper, we empirically study three kinds of auxiliary information: context consistency, type hierarchy, and background knowledge (e.g., prototypes and descriptions) of types, and propose a multi-source fusion model (MSF) targeting these sources. MSF achieves up to 11.42% and 22.84% absolute gains in macro F1 over state-of-the-art baselines on BBN and Wiki, respectively. More importantly, we further discuss the characteristics, merits, and demerits of each information source and provide an intuitive understanding of the complementarity among them.
Anthology ID:
2021.emnlp-main.210
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2668–2678
URL:
https://aclanthology.org/2021.emnlp-main.210
DOI:
10.18653/v1/2021.emnlp-main.210
Cite (ACL):
Yi Chen, Haiyun Jiang, Lemao Liu, Shuming Shi, Chuang Fan, Min Yang, and Ruifeng Xu. 2021. An Empirical Study on Multiple Information Sources for Zero-Shot Fine-Grained Entity Typing. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2668–2678, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
An Empirical Study on Multiple Information Sources for Zero-Shot Fine-Grained Entity Typing (Chen et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2021.emnlp-main.210.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2021.emnlp-main.210.mp4