Comparison by Conversion: Reverse-Engineering UCCA from Syntax and Lexical Semantics

Daniel Hershcovich, Nathan Schneider, Dotan Dvir, Jakob Prange, Miryam de Lhoneux, Omri Abend


Abstract
Building robust natural language understanding systems will require a clear characterization of whether and how various linguistic meaning representations complement each other. To perform a systematic comparative analysis, we evaluate the mapping between meaning representations from different frameworks using two complementary methods: (i) a rule-based converter, and (ii) a supervised delexicalized parser that parses to one framework using only information from the other as features. We apply these methods to convert the STREUSLE corpus (with syntactic and lexical semantic annotations) to UCCA (a graph-structured full-sentence meaning representation). Both methods yield surprisingly accurate target representations, close to fully supervised UCCA parser quality—indicating that UCCA annotations are partially redundant with STREUSLE annotations. Despite this substantial convergence between frameworks, we find several important areas of divergence.
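The rule-based conversion idea can be illustrated with a toy sketch. This is not the authors' converter (which draws on STREUSLE's full syntactic and lexical semantic annotations); it is a minimal hypothetical example mapping a few Universal Dependencies relations to UCCA edge categories, with a made-up relation table and a placeholder fallback label.

```python
# Illustrative toy sketch, NOT the paper's converter: map a few
# Universal Dependencies relations to UCCA edge categories.
# The table below is a simplified assumption for illustration only.

# Hypothetical UD-relation -> UCCA-category table
# (A=Participant, P=Process, D=Adverbial, E=Elaborator, F=Function, R=Relator)
UD_TO_UCCA = {
    "nsubj": "A",
    "obj": "A",
    "root": "P",
    "advmod": "D",
    "amod": "E",
    "det": "F",
    "case": "R",
}

def convert(edges):
    """Label each (token, UD relation) pair with a UCCA category.

    Unknown relations fall back to the placeholder label "X"
    (not a real UCCA category; a real converter would need rules
    informed by lexical semantics to decide these cases).
    """
    return [(tok, UD_TO_UCCA.get(rel, "X")) for tok, rel in edges]

# Toy UD analysis of "The dog barked loudly"
toy_edges = [("The", "det"), ("dog", "nsubj"),
             ("barked", "root"), ("loudly", "advmod")]
print(convert(toy_edges))
# [('The', 'F'), ('dog', 'A'), ('barked', 'P'), ('loudly', 'D')]
```

The paper's point is precisely that such a mapping is only partially deterministic: many UCCA distinctions are recoverable from syntax and lexical semantics, but the areas where a rule table like this fails are the areas of genuine divergence between the frameworks.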
Anthology ID:
2020.coling-main.264
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2947–2966
URL:
https://aclanthology.org/2020.coling-main.264
DOI:
10.18653/v1/2020.coling-main.264
Cite (ACL):
Daniel Hershcovich, Nathan Schneider, Dotan Dvir, Jakob Prange, Miryam de Lhoneux, and Omri Abend. 2020. Comparison by Conversion: Reverse-Engineering UCCA from Syntax and Lexical Semantics. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2947–2966, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Comparison by Conversion: Reverse-Engineering UCCA from Syntax and Lexical Semantics (Hershcovich et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.264.pdf
Code
nert-nlp/streusle (+ additional community code)
Data
Universal Dependencies