CogALex-VI Shared Task: Transrelation - A Robust Multilingual Language Model for Multilingual Relation Identification

Lennart Wachowiak, Christian Lang, Barbara Heinisch, Dagmar Gromann


Abstract
We describe our submission to the CogALex-VI shared task on the identification of multilingual paradigmatic relations, building on XLM-RoBERTa (XLM-R), a robustly optimized multilingual BERT model. Despite several experiments with data augmentation, data addition, and ensemble methods with a Siamese Triple Net, Transrelation, the XLM-R model with a linear classifier adapted to this specific task, performed best in testing and achieved the best results in the final evaluation of the shared task, even for a previously unseen language.
Anthology ID:
2020.cogalex-1.7
Volume:
Proceedings of the Workshop on the Cognitive Aspects of the Lexicon
Month:
December
Year:
2020
Address:
Online
Venue:
CogALex
Publisher:
Association for Computational Linguistics
Note:
Pages:
59–64
URL:
https://aclanthology.org/2020.cogalex-1.7
Cite (ACL):
Lennart Wachowiak, Christian Lang, Barbara Heinisch, and Dagmar Gromann. 2020. CogALex-VI Shared Task: Transrelation - A Robust Multilingual Language Model for Multilingual Relation Identification. In Proceedings of the Workshop on the Cognitive Aspects of the Lexicon, pages 59–64, Online. Association for Computational Linguistics.
Cite (Informal):
CogALex-VI Shared Task: Transrelation - A Robust Multilingual Language Model for Multilingual Relation Identification (Wachowiak et al., CogALex 2020)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2020.cogalex-1.7.pdf
Code
Text2TCS/Transrelation