Syntax-driven Iterative Expansion Language Models for Controllable Text Generation

Noe Casas, José A. R. Fonollosa, Marta R. Costa-jussà


Abstract
The dominant language modeling paradigm handles text as a sequence of discrete tokens. While that approach can capture the latent structure of the text, it is inherently constrained to sequential dynamics for text generation. We propose a new paradigm for introducing a syntactic inductive bias into neural text generation, in which the dependency parse tree drives a Transformer model to generate sentences iteratively. Our experiments show that this paradigm is effective at text generation, achieving quality between that of LSTMs and Transformers with comparable diversity, while requiring less than half their decoding steps. Furthermore, its generation process allows direct control over the syntactic constructions of the generated text, enabling the induction of stylistic variations.
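The iterative expansion scheme described above can be sketched concretely. The Python sketch below is purely illustrative: Node, stub_model, expand, generate, and every other name are hypothetical stand-ins invented for this illustration, not the authors' model or released code. It shows the control flow the abstract describes: starting from a root dependency label, each decoding step replaces every unexpanded node with a predicted head token plus dependency labels for its children, so the tree grows level by level rather than token by token.

# A minimal, hypothetical sketch of syntax-driven iterative expansion
# decoding, written to illustrate the abstract; it is NOT the authors'
# implementation. stub_model stands in for the Transformer that, in the
# paper, predicts a head token and its children's dependency relations.

from dataclasses import dataclass, field
from typing import Iterator, List, Optional, Tuple

@dataclass
class Node:
    """A node of the partially expanded dependency tree."""
    label: str                   # dependency relation, e.g. "ROOT", "nsubj"
    token: Optional[str] = None  # head word; None while still a placeholder
    children: List["Node"] = field(default_factory=list)

def walk(node: Node) -> Iterator[Node]:
    """Yield every node of the tree, root first."""
    yield node
    for child in node.children:
        yield from walk(child)

def stub_model(label: str, tree: Node) -> Tuple[str, List[str]]:
    """Hypothetical stand-in for the learned model: given a node's
    dependency label and the current tree as context, return the head
    token and the dependency labels of its children. This trivial
    version emits a fixed word with no children so generation halts."""
    return "word", []

def expand(node: Node, tree: Node) -> None:
    """One expansion: fill in the node's token and spawn child placeholders."""
    token, child_labels = stub_model(node.label, tree)
    node.token = token
    node.children = [Node(lab) for lab in child_labels]

def generate(root_label: str = "ROOT") -> Node:
    """Grow a sentence tree level by level. Every placeholder on the
    current frontier is expanded in the same step, which is why this
    scheme can need far fewer decoding steps than left-to-right
    token-by-token generation."""
    root = Node(root_label)
    while True:
        frontier = [n for n in walk(root) if n.token is None]
        if not frontier:
            return root
        for node in frontier:
            expand(node, root)

Because each expansion step commits to the children's dependency relations, constraining which relations the model may emit is one natural way to realize the direct syntactic control the abstract mentions.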
Anthology ID:
2020.spnlp-1.1
Volume:
Proceedings of the Fourth Workshop on Structured Prediction for NLP
Month:
November
Year:
2020
Address:
Online
Editors:
Priyanka Agrawal, Zornitsa Kozareva, Julia Kreutzer, Gerasimos Lampouras, André Martins, Sujith Ravi, Andreas Vlachos
Venue:
spnlp
Publisher:
Association for Computational Linguistics
Pages:
1–10
URL:
https://aclanthology.org/2020.spnlp-1.1
DOI:
10.18653/v1/2020.spnlp-1.1
Cite (ACL):
Noe Casas, José A. R. Fonollosa, and Marta R. Costa-jussà. 2020. Syntax-driven Iterative Expansion Language Models for Controllable Text Generation. In Proceedings of the Fourth Workshop on Structured Prediction for NLP, pages 1–10, Online. Association for Computational Linguistics.
Cite (Informal):
Syntax-driven Iterative Expansion Language Models for Controllable Text Generation (Casas et al., spnlp 2020)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2020.spnlp-1.1.pdf
Optional supplementary material:
2020.spnlp-1.1.OptionalSupplementaryMaterial.pdf
Video:
https://slideslive.com/38940163