Introducing Orthogonal Constraint in Structural Probes

Tomasz Limisiewicz, David Mareček


Abstract
With the recent success of pre-trained models in NLP, significant attention has been devoted to interpreting their representations. One of the most prominent approaches is structural probing (Hewitt and Manning, 2019), in which a linear projection of word embeddings is learned to approximate the topology of dependency structures. In this work, we introduce a new type of structural probing, where the linear projection is decomposed into 1. isomorphic space rotation; 2. linear scaling that identifies and scales the most relevant dimensions. In addition to syntactic dependency, we evaluate our method on two novel tasks (lexical hypernymy and position in a sentence). We jointly train the probes for multiple tasks and experimentally show that lexical and syntactic information is separated in the representations. Moreover, the orthogonal constraint makes the structural probes less vulnerable to memorization.
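To make the idea in the abstract concrete, here is a minimal, illustrative PyTorch sketch of a distance probe in the spirit of Hewitt and Manning (2019), with the projection factored into an orthogonal rotation followed by a diagonal scaling. This is not the authors' released implementation (see the repository linked under Code below); the class and method names are hypothetical, and the soft orthogonality penalty is just one common way to impose the constraint.

```python
# Illustrative sketch, not the authors' code: a structural (distance) probe whose
# projection is decomposed into an orthogonal rotation Q and a diagonal scaling d,
# so that tree distances are approximated by ||Diag(d) Q (h_i - h_j)||^2.
import torch
import torch.nn as nn


class OrthogonalStructuralProbe(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Rotation matrix Q, kept approximately orthogonal via a penalty term.
        self.rotation = nn.Parameter(torch.eye(dim))
        # Diagonal scaling vector that identifies and weights the relevant dimensions.
        self.scale = nn.Parameter(torch.ones(dim))

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, dim) contextual word vectors.
        projected = (embeddings @ self.rotation.T) * self.scale
        # Pairwise squared L2 distances between projected word vectors.
        diffs = projected.unsqueeze(2) - projected.unsqueeze(1)
        return (diffs ** 2).sum(-1)  # (batch, seq_len, seq_len)

    def orthogonality_penalty(self) -> torch.Tensor:
        # Soft orthogonality ||Q Q^T - I||_F^2; the exact regularizer in the paper may differ.
        q = self.rotation
        eye = torch.eye(q.shape[0], device=q.device)
        return ((q @ q.T - eye) ** 2).sum()
```

In such a setup, the probe would typically be trained to match gold dependency-tree distances (e.g., with an L1 loss) plus a weighted orthogonality penalty, and the learned scaling vector indicates which dimensions carry the probed information.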
Anthology ID:
2021.acl-long.36
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
428–442
URL:
https://aclanthology.org/2021.acl-long.36
DOI:
10.18653/v1/2021.acl-long.36
Cite (ACL):
Tomasz Limisiewicz and David Mareček. 2021. Introducing Orthogonal Constraint in Structural Probes. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 428–442, Online. Association for Computational Linguistics.
Cite (Informal):
Introducing Orthogonal Constraint in Structural Probes (Limisiewicz & Mareček, ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2021.acl-long.36.pdf
Video:
https://preview.aclanthology.org/ingest-acl-2023-videos/2021.acl-long.36.mp4
Code:
Tom556/OrthogonalTransformerProbing
Data:
English Web Treebank