MT/IE: Cross-lingual Open Information Extraction with Neural Sequence-to-Sequence Models

Sheng Zhang, Kevin Duh, Benjamin Van Durme


Abstract
Cross-lingual information extraction is the task of distilling facts from foreign language (e.g. Chinese text) into representations in another language that is preferred by the user (e.g. English tuples). Conventional pipeline solutions decompose the task as machine translation followed by information extraction (or vice versa). We propose a joint solution with a neural sequence model, and show that it outperforms the pipeline in a cross-lingual open information extraction setting by 1-4 BLEU and 0.5-0.8 F1.
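The joint approach can be pictured as a single encoder-decoder that reads the foreign-language sentence and directly emits a linearized English tuple, rather than translating first and extracting second. The sketch below is a minimal illustration of that framing and is not the authors' exact model; the architecture details, vocabulary sizes, special tokens, and toy data are assumptions for demonstration only.

```python
# Minimal sketch (assumed, not the paper's exact architecture) of joint MT/IE:
# a sequence-to-sequence model maps source-language tokens directly to a
# linearized English OpenIE tuple such as "( arg1 ; rel ; arg2 )".

import torch
import torch.nn as nn

PAD, BOS, EOS = 0, 1, 2  # assumed special-token ids

class Seq2SeqMTIE(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim, padding_idx=PAD)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim, padding_idx=PAD)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_in_ids):
        # Encode the foreign-language sentence into a context vector.
        _, h = self.encoder(self.src_emb(src_ids))
        # Decode the linearized English tuple conditioned on that context.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_in_ids), h)
        return self.out(dec_out)  # (batch, tgt_len, tgt_vocab) logits

# Toy usage with made-up token ids: one (source sentence, linearized tuple) pair.
model = Seq2SeqMTIE(src_vocab=1000, tgt_vocab=1000)
src = torch.tensor([[5, 17, 42, EOS]])     # e.g. a Chinese sentence
tgt = torch.tensor([[BOS, 7, 8, 9, EOS]])  # e.g. "( arg1 ; rel ; arg2 )"
logits = model(src, tgt[:, :-1])           # teacher forcing
loss = nn.CrossEntropyLoss(ignore_index=PAD)(
    logits.reshape(-1, logits.size(-1)), tgt[:, 1:].reshape(-1))
loss.backward()
```

In this framing, the output vocabulary contains English words plus tuple-delimiter symbols, so a single decoding pass performs both translation and extraction; the pipeline baseline instead trains two separate systems and chains them.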
Anthology ID: E17-2011
Volume: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
Month: April
Year: 2017
Address: Valencia, Spain
Editors: Mirella Lapata, Phil Blunsom, Alexander Koller
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 64–70
URL: https://aclanthology.org/E17-2011
Cite (ACL): Sheng Zhang, Kevin Duh, and Benjamin Van Durme. 2017. MT/IE: Cross-lingual Open Information Extraction with Neural Sequence-to-Sequence Models. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers, pages 64–70, Valencia, Spain. Association for Computational Linguistics.
Cite (Informal): MT/IE: Cross-lingual Open Information Extraction with Neural Sequence-to-Sequence Models (Zhang et al., EACL 2017)
PDF: https://preview.aclanthology.org/landing_page/E17-2011.pdf