Probing for Predicate Argument Structures in Pretrained Language Models

Simone Conia, Roberto Navigli


Abstract
Thanks to the effectiveness and wide availability of modern pretrained language models (PLMs), recently proposed approaches have achieved remarkable results in dependency- and span-based, multilingual and cross-lingual Semantic Role Labeling (SRL). These results have prompted researchers to investigate the inner workings of modern PLMs with the aim of understanding how, where, and to what extent they encode information about SRL. In this paper, we follow this line of research and probe for predicate argument structures in PLMs. Our study shows that PLMs do encode semantic structures directly into the contextualized representation of a predicate, and also provides insights into the correlation between predicate senses and their structures, the degree of transferability between nominal and verbal structures, and how such structures are encoded across languages. Finally, we look at the practical implications of such insights and demonstrate the benefits of embedding predicate argument structure information into an SRL model.
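The probing methodology the abstract refers to can be illustrated with a minimal sketch: a lightweight linear classifier is trained on frozen predicate representations to predict a structure-related label, so any above-chance accuracy must come from information already encoded in the representations rather than from the probe itself. Everything below is an illustrative stand-in, not the paper's actual implementation: random vectors take the place of contextualized predicate embeddings from a PLM, and the synthetic labels stand in for predicate-argument structure classes.

```python
import numpy as np

# Stand-in data: random 768-d vectors in place of frozen contextualized
# predicate embeddings, with synthetic labels in place of
# predicate-argument structure classes (illustrative only).
rng = np.random.default_rng(0)
n, dim, n_classes = 200, 768, 4
X = rng.normal(size=(n, dim))
y = rng.integers(0, n_classes, size=n)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# A linear "probe": multinomial logistic regression trained with plain
# gradient descent. The (simulated) PLM representations X stay frozen;
# only W and b are learned.
W = np.zeros((dim, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[y]
lr = 0.1
for _ in range(100):
    p = softmax(X @ W + b)            # predicted class probabilities
    grad = X.T @ (p - onehot) / n     # cross-entropy gradient w.r.t. W
    W -= lr * grad
    b -= lr * (p - onehot).mean(axis=0)

preds = softmax(X @ W + b).argmax(axis=1)
accuracy = (preds == y).mean()
```

Because the probe is deliberately low-capacity, its accuracy on held-out data is commonly read as a measure of how linearly accessible the target information is in the representations; with real PLM embeddings one would compare against a random-representation baseline like the one simulated here.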
Anthology ID:
2022.acl-long.316
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4622–4632
URL:
https://aclanthology.org/2022.acl-long.316
DOI:
10.18653/v1/2022.acl-long.316
Cite (ACL):
Simone Conia and Roberto Navigli. 2022. Probing for Predicate Argument Structures in Pretrained Language Models. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 4622–4632, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Probing for Predicate Argument Structures in Pretrained Language Models (Conia & Navigli, ACL 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.316.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2022.acl-long.316.mp4
Code:
sapienzanlp/srl-pas-probing