Graph-Induced Syntactic-Semantic Spaces in Transformer-Based Variational AutoEncoders

Yingji Zhang, Marco Valentino, Danilo Carvalho, Ian Pratt-Hartmann, Andre Freitas


Abstract
The injection of syntactic information in Variational AutoEncoders (VAEs) can result in an overall improvement of performance and generalisation. An effective strategy to achieve such a goal is to separate the encoding of distributional semantic features and syntactic structures into heterogeneous latent spaces via multi-task learning or dual encoder architectures. However, existing works employing such techniques are limited to LSTM-based VAEs. This work investigates latent space separation methods for structural syntactic injection in Transformer-based VAE architectures (i.e., Optimus) through the integration of graph-based models. Our empirical evaluation reveals that the proposed end-to-end VAE architecture can improve the overall organisation of the latent space, alleviating the information loss occurring in standard VAE setups, and resulting in enhanced performance on language modelling and downstream generation tasks.
Anthology ID:
2024.findings-naacl.32
Volume:
Findings of the Association for Computational Linguistics: NAACL 2024
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
474–489
URL:
https://aclanthology.org/2024.findings-naacl.32
Cite (ACL):
Yingji Zhang, Marco Valentino, Danilo Carvalho, Ian Pratt-Hartmann, and Andre Freitas. 2024. Graph-Induced Syntactic-Semantic Spaces in Transformer-Based Variational AutoEncoders. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 474–489, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Graph-Induced Syntactic-Semantic Spaces in Transformer-Based Variational AutoEncoders (Zhang et al., Findings 2024)
PDF:
https://preview.aclanthology.org/naacl24-info/2024.findings-naacl.32.pdf
Copyright:
2024.findings-naacl.32.copyright.pdf