Geo-Seq2seq: Twitter User Geolocation on Noisy Data through Sequence to Sequence Learning

Jingyu Zhang, Alexandra DeLucia, Chenyu Zhang, Mark Dredze


Abstract
Location information can support social media analyses by providing geographic context. Some of the most accurate and popular Twitter geolocation systems rely on rule-based methods that examine the user-provided profile location, which fail to handle informal or noisy location names. We propose Geo-Seq2seq, a sequence-to-sequence (seq2seq) model for Twitter user geolocation that rewrites noisy, multilingual user-provided location strings into structured English location names. We train our system on tens of millions of multilingual location string and geotagged-tweet pairs. Compared to leading methods, our model vastly increases coverage (i.e., the number of users we can geolocate) while achieving comparable or superior accuracy. Our error analysis reveals that constrained decoding helps the model produce valid locations according to a location database. Finally, we measure biases across language, country of origin, and time to evaluate fairness, and find that while our model can generalize well to unseen temporal data, performance does vary by language and country.
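The abstract notes that constrained decoding helps the model emit only locations that exist in a location database. A common way to implement this is to restrict each decoding step to tokens that extend a valid entry in a prefix trie built over the database. The sketch below is illustrative only: the toy database, whitespace tokenization, and hand-set scores are assumptions, not the paper's actual setup (a real system would use the seq2seq model's logits as the scorer).

```python
# Minimal sketch of trie-constrained greedy decoding over a toy
# location database. All names and scores here are illustrative
# assumptions, not the paper's actual data or model.

from collections import defaultdict

END = "</s>"  # sentinel marking a complete location name

def build_trie(locations):
    """Map each token prefix to the set of tokens allowed next."""
    allowed = defaultdict(set)
    for name in locations:
        tokens = name.split() + [END]
        for i, tok in enumerate(tokens):
            allowed[tuple(tokens[:i])].add(tok)
    return allowed

def constrained_decode(score_fn, allowed, max_len=10):
    """Greedy decoding that only considers trie-permitted tokens."""
    prefix = ()
    while len(prefix) < max_len:
        candidates = allowed.get(prefix, set())
        if not candidates:
            break
        # Pick the highest-scoring token among valid continuations only,
        # so the output is guaranteed to be a database entry.
        best = max(candidates, key=lambda tok: score_fn(prefix, tok))
        if best == END:
            return " ".join(prefix)
        prefix = prefix + (best,)
    return " ".join(prefix)

if __name__ == "__main__":
    db = ["new york us", "new delhi in", "toronto ca"]
    trie = build_trie(db)

    # Toy scorer; a real system would use the model's output logits.
    def score_fn(prefix, tok):
        prefs = {"new": 2.0, "york": 1.5, "delhi": 1.0, "us": 1.0, END: 0.5}
        return prefs.get(tok, 0.0)

    print(constrained_decode(score_fn, trie))  # → new york us
```

Even when the unconstrained model would prefer an invalid string, masking each step to trie-valid continuations forces the output into the database's vocabulary of location names.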
Anthology ID:
2023.findings-acl.294
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4778–4794
URL:
https://aclanthology.org/2023.findings-acl.294
DOI:
10.18653/v1/2023.findings-acl.294
Cite (ACL):
Jingyu Zhang, Alexandra DeLucia, Chenyu Zhang, and Mark Dredze. 2023. Geo-Seq2seq: Twitter User Geolocation on Noisy Data through Sequence to Sequence Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 4778–4794, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Geo-Seq2seq: Twitter User Geolocation on Noisy Data through Sequence to Sequence Learning (Zhang et al., Findings 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.findings-acl.294.pdf