Abstract
The principle of compositionality has deep roots in linguistics: the meaning of an expression is determined by its structure and the meanings of its constituents. However, modern neural network models such as long short-term memory (LSTM) networks process expressions in a linear fashion and do not seem to incorporate more complex compositional patterns. In this work, we show that grammar can be explicitly induced by tracing the computational process of an LSTM network. We show that: (i) the multiplicative nature of the LSTM allows complex interactions beyond sequential linear combination; (ii) compositional trees can be generated from the network without external linguistic knowledge; (iii) we evaluate the syntactic differences among the generated trees, randomly generated trees, and gold reference trees produced by constituency parsers; and (iv) we evaluate whether the generated trees carry rich semantic information.
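To make the idea concrete, below is a minimal, illustrative sketch (not the paper's exact algorithm) of how Shapley-style interaction scores could drive tree induction from an LSTM: for each pair of adjacent constituents, it estimates a two-player Shapley interaction by ablating tokens against a zero-embedding baseline, then greedily merges the pair with the strongest interaction. The scalar `readout` head, the zero baseline, and the greedy merging heuristic are all assumptions made for illustration; the paper's actual decomposition of the LSTM's internal computation is more involved.

```python
# Illustrative sketch only: induce an unlabeled binary tree from an LSTM
# by scoring Shapley-style interactions between adjacent token positions
# (zero-embedding ablation baseline) and greedily merging the strongest pair.
import torch
import torch.nn as nn

torch.manual_seed(0)

EMB, HID = 16, 32
lstm = nn.LSTM(EMB, HID, batch_first=True)
readout = nn.Linear(HID, 1)  # hypothetical scalar output head f(x)

def f(emb):
    """Scalar network output for a (1, T, EMB) embedded sequence."""
    out, _ = lstm(emb)
    return readout(out[:, -1]).item()

def interaction(emb, i, j):
    """Two-player Shapley interaction of positions i and j:
    f(both present) - f(only i) - f(only j) + f(neither),
    with 'absent' tokens replaced by a zero-embedding baseline."""
    def ablate(keep):
        e = emb.clone()
        for t in (i, j):
            if t not in keep:
                e[:, t] = 0.0
        return e
    return (f(ablate({i, j})) - f(ablate({i}))
            - f(ablate({j})) + f(ablate(set())))

def induce_tree(tokens, emb):
    """Greedy agglomerative parsing: repeatedly merge the adjacent pair
    of constituents whose boundary positions interact most strongly."""
    nodes = list(tokens)
    spans = [(i, i) for i in range(len(tokens))]  # word span per node
    while len(nodes) > 1:
        scores = [abs(interaction(emb, spans[k][1], spans[k + 1][0]))
                  for k in range(len(nodes) - 1)]
        k = max(range(len(scores)), key=scores.__getitem__)
        nodes[k:k + 2] = [(nodes[k], nodes[k + 1])]
        spans[k:k + 2] = [(spans[k][0], spans[k + 1][1])]
    return nodes[0]

tokens = "the cat sat on the mat".split()
emb = torch.randn(1, len(tokens), EMB)  # stand-in word embeddings
with torch.no_grad():
    print(induce_tree(tokens, emb))
```

Running this prints a nested-tuple binary tree over the six tokens. With randomly initialized weights the merges are arbitrary; the premise of the paper is that a trained LSTM's multiplicative gating makes these interaction scores reflect learned compositional structure.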
- Anthology ID:
- 2020.acl-srw.40
- Volume:
- Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
- Month:
- July
- Year:
- 2020
- Address:
- Online
- Venue:
- ACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 299–305
- URL:
- https://aclanthology.org/2020.acl-srw.40
- DOI:
- 10.18653/v1/2020.acl-srw.40
- Cite (ACL):
- Yuhui Zhang and Allen Nie. 2020. Inducing Grammar from Long Short-Term Memory Networks by Shapley Decomposition. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 299–305, Online. Association for Computational Linguistics.
- Cite (Informal):
- Inducing Grammar from Long Short-Term Memory Networks by Shapley Decomposition (Zhang & Nie, ACL 2020)
- PDF:
- https://preview.aclanthology.org/paclic-22-ingestion/2020.acl-srw.40.pdf