Transcoding Compositionally: Using Attention to Find More Generalizable Solutions

Kris Korrel, Dieuwke Hupkes, Verna Dankers, Elia Bruni


Abstract
While sequence-to-sequence models have shown remarkable generalization power across several natural language tasks, the solutions they construct are argued to be less compositional than human-like generalization. In this paper, we present seq2attn, a new architecture that is specifically designed to exploit attention to find compositional patterns in the input. In seq2attn, the two standard components of an encoder-decoder model are connected via a transcoder that modulates the information flow between them. We show that seq2attn can successfully generalize, without requiring any additional supervision, on two tasks that are specifically constructed to challenge the compositional skills of neural networks. The solutions found by the model are highly interpretable, allowing easy analysis of both the types of solutions that are found and the potential causes of mistakes. We exploit this opportunity to introduce a new paradigm for testing compositionality, which studies the extent to which a model overgeneralizes when confronted with exceptions. We show that seq2attn exhibits such overgeneralization to a larger degree than a standard sequence-to-sequence model.
Anthology ID:
W19-4801
Volume:
Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Tal Linzen, Grzegorz Chrupała, Yonatan Belinkov, Dieuwke Hupkes
Venue:
BlackboxNLP
Publisher:
Association for Computational Linguistics
Pages:
1–11
URL:
https://aclanthology.org/W19-4801
DOI:
10.18653/v1/W19-4801
Cite (ACL):
Kris Korrel, Dieuwke Hupkes, Verna Dankers, and Elia Bruni. 2019. Transcoding Compositionally: Using Attention to Find More Generalizable Solutions. In Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, pages 1–11, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Transcoding Compositionally: Using Attention to Find More Generalizable Solutions (Korrel et al., BlackboxNLP 2019)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/W19-4801.pdf
Data
SCAN