Fine-grained Entity Typing without Knowledge Base
Jing Qian, Yibin Liu, Lemao Liu, Yangming Li, Haiyun Jiang, Haisong Zhang, Shuming Shi
Abstract
Existing work on Fine-grained Entity Typing (FET) typically trains automatic models on datasets obtained by using Knowledge Bases (KBs) as distant supervision. However, this reliance means the training setting can be hampered by the absence or incompleteness of a KB. To alleviate this limitation, we propose a novel setting for training FET models: FET without accessing any knowledge base. Under this setting, we propose a two-step framework to train FET models. In the first step, we automatically create pseudo data with fine-grained labels from a large unlabeled dataset. Then a neural network model is trained based on the pseudo data, either in an unsupervised way or using self-training under weak guidance from a coarse-grained Named Entity Recognition (NER) model. Experimental results show that our method achieves competitive performance with respect to the models trained on the original KB-supervised datasets.
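The abstract's two-step framework can be illustrated with a small sketch: step 1 creates pseudo fine-grained labels from unlabeled text, and step 2 self-trains a typing model, keeping only predictions whose fine type is consistent with a coarse-grained NER model. Everything below (the toy corpus, cue lists, type inventory, and stand-in models) is a hypothetical illustration of the setting, not the authors' released code (see lemaoliu/fet-data for that).

```python
from collections import Counter

# Toy unlabeled corpus: (mention, sentence) pairs. Purely illustrative.
UNLABELED = [
    ("Adele", "Adele, the British singer, released a new album."),
    ("Tencent", "Tencent, the technology company, posted earnings."),
    ("Paris", "Paris is the capital city of France."),
]

# Hypothetical fine-to-coarse type mapping used for the weak agreement check.
FINE_TO_COARSE = {
    "/person/artist": "PERSON",
    "/organization/company": "ORG",
    "/location/city": "LOC",
}

def create_pseudo_data(corpus):
    """Step 1: label mentions from context cues, standing in for the
    paper's automatic pseudo-data creation over unlabeled text."""
    cues = {"singer": "/person/artist",
            "company": "/organization/company",
            "city": "/location/city"}
    return [(m, s, t) for m, s in corpus
            for cue, t in cues.items() if cue in s]

def coarse_ner(mention):
    """Stand-in for an off-the-shelf coarse-grained NER model."""
    return {"Adele": "PERSON", "Tencent": "ORG", "Paris": "LOC"}[mention]

def fit(train):
    """Toy 'model': memorize seen mentions, back off to the most frequent
    type. A real system would train a neural classifier here."""
    seen = {m: t for m, _, t in train}
    default = Counter(t for _, _, t in train).most_common(1)[0][0]
    return lambda m, s: seen.get(m, default)

def self_train(unlabeled, rounds=2):
    """Step 2: train on pseudo data, then iteratively add predictions
    whose fine type maps to the coarse type the NER model assigns."""
    train = create_pseudo_data(unlabeled)
    model = fit(train)
    for _ in range(rounds):
        for m, s in unlabeled:
            fine = model(m, s)
            if FINE_TO_COARSE.get(fine) == coarse_ner(m):
                train.append((m, s, fine))  # weakly verified example
        model = fit(train)
    return model

model = self_train(UNLABELED)
print(model("Adele", ""))  # -> /person/artist
```

The agreement check in `self_train` plays the role of the weak guidance mentioned in the abstract: a fine-grained prediction is only added to the training set when it collapses to the same coarse type the NER model assigns.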
- Anthology ID: 2021.emnlp-main.431
- Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month: November
- Year: 2021
- Address: Online and Punta Cana, Dominican Republic
- Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 5309–5319
- URL: https://aclanthology.org/2021.emnlp-main.431
- DOI: 10.18653/v1/2021.emnlp-main.431
- Cite (ACL): Jing Qian, Yibin Liu, Lemao Liu, Yangming Li, Haiyun Jiang, Haisong Zhang, and Shuming Shi. 2021. Fine-grained Entity Typing without Knowledge Base. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5309–5319, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal): Fine-grained Entity Typing without Knowledge Base (Qian et al., EMNLP 2021)
- PDF: https://aclanthology.org/2021.emnlp-main.431.pdf
- Code: lemaoliu/fet-data
- Data: FIGER