Ultra-Fine Entity Typing

Eunsol Choi, Omer Levy, Yejin Choi, Luke Zettlemoyer


Abstract
We introduce a new entity typing task: given a sentence with an entity mention, the goal is to predict a set of free-form phrases (e.g. skyscraper, songwriter, or criminal) that describe appropriate types for the target entity. This formulation allows us to use a new type of distant supervision at large scale: head words, which indicate the type of the noun phrases they appear in. We show that these ultra-fine types can be crowd-sourced, and introduce new evaluation sets that are much more diverse and fine-grained than existing benchmarks. We present a model that can predict ultra-fine types, and is trained using a multitask objective that pools our new head-word supervision with prior supervision from entity linking. Experimental results demonstrate that our model is effective in predicting entity types at varying granularity; it achieves state of the art performance on an existing fine-grained entity typing benchmark, and sets baselines for our newly-introduced datasets.
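The abstract frames typing as predicting a *set* of free-form type phrases rather than a single label, i.e. multi-label classification over a large open type vocabulary. A minimal sketch of that decoding step, with a toy vocabulary and hypothetical logits (not values from the paper):

```python
import math

# Toy open type vocabulary (hypothetical; the paper's vocabulary is far larger).
TYPE_VOCAB = ["person", "musician", "songwriter", "building", "skyscraper", "criminal"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_types(logits, threshold=0.5):
    """Multi-label decoding: keep every type whose independent sigmoid
    probability clears the threshold. Unlike a softmax over types,
    several types can fire for one mention (e.g. person AND songwriter)."""
    return [t for t, s in zip(TYPE_VOCAB, logits) if sigmoid(s) > threshold]

# Hypothetical per-type logits for a mention such as "Leonard Cohen".
logits = [2.1, 1.3, 0.8, -3.0, -4.2, -2.5]
print(predict_types(logits))  # -> ['person', 'musician', 'songwriter']
```

The independent per-type sigmoids are what make varying granularity possible: coarse and ultra-fine types are scored side by side rather than competing in one distribution.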
Anthology ID:
P18-1009
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
87–96
URL:
https://aclanthology.org/P18-1009
DOI:
10.18653/v1/P18-1009
Cite (ACL):
Eunsol Choi, Omer Levy, Yejin Choi, and Luke Zettlemoyer. 2018. Ultra-Fine Entity Typing. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 87–96, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Ultra-Fine Entity Typing (Choi et al., ACL 2018)
PDF:
https://preview.aclanthology.org/update-css-js/P18-1009.pdf
Presentation:
P18-1009.Presentation.pdf
Video:
https://vimeo.com/285807855
Data:
Open Entity
FIGER
OntoNotes 5.0