SpaBERT: A Pretrained Language Model from Geographic Data for Geo-Entity Representation

Zekun Li, Jina Kim, Yao-Yi Chiang, Muhao Chen


Abstract
Named geographic entities (geo-entities for short) are the building blocks of many geographic datasets. Characterizing geo-entities is integral to various application domains, such as geo-intelligence and map comprehension, but a key challenge is capturing the spatially varying context of an entity. We hypothesize that we shall know the characteristics of a geo-entity by its surrounding entities, similar to knowing word meanings by their linguistic context. Accordingly, we propose a novel spatial language model, SpaBERT, which provides a general-purpose geo-entity representation based on neighboring entities in geospatial data. SpaBERT extends BERT to capture linearized spatial context and incorporates a spatial coordinate embedding mechanism to preserve the spatial relations of entities in two-dimensional space. SpaBERT is pretrained with masked language modeling and masked entity prediction tasks to learn spatial dependencies. We apply SpaBERT to two downstream tasks: geo-entity typing and geo-entity linking. Compared with existing language models that do not use spatial context, SpaBERT shows significant performance improvements on both tasks. We also analyze the entity representations produced by SpaBERT in various settings and the effect of spatial coordinate embedding.
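The core idea of the linearized spatial context can be illustrated with a minimal sketch: order a pivot entity's neighbors by distance and flatten their names into a pseudo-sentence, tagging every token with its entity's distance (a stand-in for the spatial coordinate embedding input). All entity names and coordinates below are hypothetical, and the real model's tokenization and embedding details differ; this only shows the linearization step.

```python
import math

# Hypothetical toy data: a pivot geo-entity and nearby entities with (x, y) coordinates.
pivot = {"name": "Lake Station", "xy": (0.0, 0.0)}
neighbors = [
    {"name": "River Park",  "xy": (3.0, 4.0)},
    {"name": "Oak Mall",    "xy": (1.0, 1.0)},
    {"name": "Hill Museum", "xy": (6.0, 8.0)},
]

def linearize(pivot, neighbors):
    """Order neighbors by Euclidean distance to the pivot and emit a
    pseudo-sentence plus one distance value per token. This mirrors the
    linearized-context idea in the abstract; it is not the paper's exact
    procedure."""
    def dist(entity):
        dx = entity["xy"][0] - pivot["xy"][0]
        dy = entity["xy"][1] - pivot["xy"][1]
        return math.hypot(dx, dy)

    ordered = sorted(neighbors, key=dist)
    tokens, dists = [], []
    for entity in [pivot] + ordered:
        d = dist(entity)
        for tok in entity["name"].split():
            tokens.append(tok)
            dists.append(d)  # every token of an entity shares its distance
    return tokens, dists

tokens, dists = linearize(pivot, neighbors)
print(tokens)  # pivot first, then neighbors from nearest to farthest
print(dists)
```

A downstream model would then embed `tokens` as usual BERT input while mapping each value in `dists` through a separate spatial embedding added to the token embeddings, so that two-dimensional proximity survives the flattening into a one-dimensional sequence.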
Anthology ID:
2022.findings-emnlp.200
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2757–2769
URL:
https://aclanthology.org/2022.findings-emnlp.200
DOI:
10.18653/v1/2022.findings-emnlp.200
Cite (ACL):
Zekun Li, Jina Kim, Yao-Yi Chiang, and Muhao Chen. 2022. SpaBERT: A Pretrained Language Model from Geographic Data for Geo-Entity Representation. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2757–2769, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
SpaBERT: A Pretrained Language Model from Geographic Data for Geo-Entity Representation (Li et al., Findings 2022)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.200.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2022.findings-emnlp.200.mp4