Towards a Broad Coverage Named Entity Resource: A Data-Efficient Approach for Many Diverse Languages

Silvia Severini, Ayyoob ImaniGooghari, Philipp Dufter, Hinrich Schütze


Abstract
Parallel corpora are ideal for extracting a multilingual named entity (MNE) resource, i.e., a dataset of names translated into multiple languages. Prior work on extracting MNE datasets from parallel corpora required resources such as large monolingual corpora or word aligners that are unavailable or perform poorly for under-resourced languages. We present CLC-BN, a new method for creating an MNE resource, and apply it to the Parallel Bible Corpus, a corpus of more than 1000 languages. CLC-BN learns a neural transliteration model from parallel-corpus statistics, without requiring any other bilingual resources, word aligners, or seed data. Experimental results show that CLC-BN clearly outperforms prior work. We release an MNE resource for 1340 languages and demonstrate its effectiveness in two downstream tasks: knowledge graph augmentation and bilingual lexicon induction.
Anthology ID:
2022.lrec-1.417
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
3923–3933
URL:
https://aclanthology.org/2022.lrec-1.417
Cite (ACL):
Silvia Severini, Ayyoob ImaniGooghari, Philipp Dufter, and Hinrich Schütze. 2022. Towards a Broad Coverage Named Entity Resource: A Data-Efficient Approach for Many Diverse Languages. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 3923–3933, Marseille, France. European Language Resources Association.
Cite (Informal):
Towards a Broad Coverage Named Entity Resource: A Data-Efficient Approach for Many Diverse Languages (Severini et al., LREC 2022)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2022.lrec-1.417.pdf