Abstract
We propose a cross-lingual encoder-decoder model that simultaneously translates and generates sentences with Semantic Role Labeling (SRL) annotations in a resource-poor target language. Unlike annotation projection techniques, our model does not need parallel data at inference time. Our approach can be applied in monolingual, multilingual and cross-lingual settings and is able to produce both dependency-based and span-based SRL annotations. We benchmark the labeling performance of our model in different monolingual and multilingual settings using well-known SRL datasets. We then train our model in a cross-lingual setting to generate new SRL-labeled data. Finally, we measure the effectiveness of our method by using the generated data to augment the training basis for resource-poor languages, and perform a manual evaluation to show that it produces high-quality sentences and assigns accurate semantic role annotations. Our proposed architecture offers a flexible method for leveraging SRL data in multiple languages.
- Anthology ID:
- D19-1056
- Volume:
- Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
- Month:
- November
- Year:
- 2019
- Address:
- Hong Kong, China
- Venues:
- EMNLP | IJCNLP
- SIG:
- SIGDAT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 603–615
- URL:
- https://aclanthology.org/D19-1056
- DOI:
- 10.18653/v1/D19-1056
- Cite (ACL):
- Angel Daza and Anette Frank. 2019. Translate and Label! An Encoder-Decoder Approach for Cross-lingual Semantic Role Labeling. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 603–615, Hong Kong, China. Association for Computational Linguistics.
- Cite (Informal):
- Translate and Label! An Encoder-Decoder Approach for Cross-lingual Semantic Role Labeling (Daza & Frank, EMNLP-IJCNLP 2019)
- PDF:
- https://aclanthology.org/D19-1056.pdf
- Code:
- Heidelberg-NLP/SRL-S2S
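The abstract describes a sequence-to-sequence model whose decoder emits a translated sentence interleaved with semantic role annotations. A minimal sketch of one way such a labeled target sequence can be built is shown below, assuming a bracketed label-token convention; the exact token scheme (e.g. `(A0`, `)`) and the helper name `linearize` are illustrative assumptions, not necessarily the paper's implementation.

```python
def linearize(tokens, spans):
    """Interleave opening/closing role-label tokens with sentence tokens.

    tokens: list of words in the (translated) target sentence.
    spans:  list of (start, end, role) tuples over token indices,
            end exclusive; assumed non-overlapping and sorted by start.
    Returns a single flat token sequence suitable as a seq2seq target.
    """
    out = []
    for i, tok in enumerate(tokens):
        # Open any role span starting at this token.
        for start, end, role in spans:
            if start == i:
                out.append(f"({role}")
        out.append(tok)
        # Close any role span ending after this token.
        for start, end, role in spans:
            if end == i + 1:
                out.append(")")
    return out


labeled = linearize(
    ["He", "saw", "the", "dog"],
    [(0, 1, "A0"), (1, 2, "V"), (2, 4, "A1")],
)
# labeled == ["(A0", "He", ")", "(V", "saw", ")", "(A1", "the", "dog", ")"]
```

At training time, such label-augmented sequences let a standard encoder-decoder learn translation and labeling jointly; at inference, the labels can be stripped back out of the generated output to recover the SRL spans.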