Generative Entity Typing with Curriculum Learning
Siyu Yuan | Deqing Yang | Jiaqing Liang | Zhixu Li | Jinxi Liu | Jingyue Huang | Yanghua Xiao
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Entity typing aims to assign types to the entity mentions in given texts. The traditional classification-based entity typing paradigm has two unignorable drawbacks: 1) it fails to assign an entity to types beyond the predefined type set, and 2) it can hardly handle few-shot and zero-shot situations where many long-tail types have only a few or even no training instances. To overcome these drawbacks, we propose a novel generative entity typing (GET) paradigm: given a text with an entity mention, the multiple types for the role that the entity plays in the text are generated with a pre-trained language model (PLM). However, PLMs tend to generate coarse-grained types after fine-tuning on the entity typing dataset. In addition, only heterogeneous training data, consisting of a small portion of human-annotated data and a large portion of auto-generated but low-quality data, is provided for model training. To tackle these problems, we employ curriculum learning (CL) to train our GET model on the heterogeneous data, where the curriculum can be self-adjusted via self-paced learning according to the model's comprehension of the type granularity and data heterogeneity. Our extensive experiments on datasets of different languages and downstream tasks justify the superiority of our GET model over state-of-the-art entity typing models. The code has been released at https://github.com/siyuyuan/GET.
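The self-paced learning component mentioned in the abstract can be illustrated with a minimal sketch: at each round, only samples whose current loss falls below a threshold are admitted to training, and the threshold grows so that harder (e.g. noisier auto-generated) examples enter the curriculum later. The function names and constants below are illustrative assumptions, not taken from the paper's released code.

```python
def select_curriculum(losses, lam):
    """Return indices of 'easy' samples whose current loss is below lam."""
    return [i for i, loss in enumerate(losses) if loss < lam]

def self_paced_schedule(losses, lam=0.5, growth=1.5, rounds=3):
    """Grow lam each round, admitting progressively harder samples.

    In a real training loop, losses would be recomputed with the model
    after each round; here they are held fixed for illustration.
    """
    schedule = []
    for _ in range(rounds):
        schedule.append(select_curriculum(losses, lam))
        lam *= growth
    return schedule

# Toy per-sample losses: low values stand in for clean human-annotated
# data, high values for noisy auto-generated data.
losses = [0.2, 1.1, 0.4, 2.0, 0.6]
print(self_paced_schedule(losses))
# -> [[0, 2], [0, 2, 4], [0, 1, 2, 4]]
```

The hardest sample (index 3) is never admitted within three rounds, mirroring how self-paced learning defers the lowest-quality data until the model is ready for it.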