On Neurons Invariant to Sentence Structural Changes in Neural Machine Translation

Gal Patel, Leshem Choshen, Omri Abend


Abstract
We present a methodology that explores how sentence structure is reflected in neural representations of machine translation systems. We demonstrate our model-agnostic approach with the Transformer English-German translation model. We analyze neuron-level correlation of activations between paraphrases while discussing the methodological challenges and the need for confound analysis to isolate the effects of shallow cues. We find that similarity between activation patterns can be mostly accounted for by similarity in word choice and sentence length. Following that, we manipulate neuron activations to control the syntactic form of the output. We show this intervention to be somewhat successful, indicating that deep models capture sentence-structure distinctions, despite finding no such indication at the neuron level. To conduct our experiments, we develop a semi-automatic method to generate meaning-preserving minimal pair paraphrases (active-passive voice and adverbial clause-noun phrase) and compile a corpus of such pairs.
Anthology ID:
2022.conll-1.14
Volume:
Proceedings of the 26th Conference on Computational Natural Language Learning (CoNLL)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Antske Fokkens, Vivek Srikumar
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
194–212
URL:
https://aclanthology.org/2022.conll-1.14
DOI:
10.18653/v1/2022.conll-1.14
Cite (ACL):
Gal Patel, Leshem Choshen, and Omri Abend. 2022. On Neurons Invariant to Sentence Structural Changes in Neural Machine Translation. In Proceedings of the 26th Conference on Computational Natural Language Learning (CoNLL), pages 194–212, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
On Neurons Invariant to Sentence Structural Changes in Neural Machine Translation (Patel et al., CoNLL 2022)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/2022.conll-1.14.pdf
Data:
2022.conll-1.14.data.zip
Video:
https://preview.aclanthology.org/naacl-24-ws-corrections/2022.conll-1.14.mp4