Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale

Xiang Hu, Pengyu Ji, Qingyang Zhu, Wei Wu, Kewei Tu


Abstract
A syntactic language model (SLM) incrementally generates a sentence together with its syntactic tree in a left-to-right manner. We present Generative Pretrained Structured Transformers (GPST), an unsupervised SLM at scale capable of being pre-trained from scratch on raw texts with high parallelism. GPST circumvents the limitations of previous SLMs, such as reliance on gold trees and sequential training. It consists of two components: a standard SLM supervised by a uni-directional language modeling loss, and an additional composition model, which induces syntactic parse trees and computes constituent representations, supervised by a bi-directional language modeling loss. We propose a representation surrogate to enable joint parallel training of the two models in a hard-EM fashion. We pre-train GPST on OpenWebText, a corpus of 9 billion tokens, and demonstrate the superiority of GPST over GPT-2 of a comparable size on numerous tasks covering both language understanding and language generation. Meanwhile, GPST also significantly outperforms existing unsupervised SLMs on left-to-right grammar induction, while training substantially faster.
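To make the two-component design concrete, below is a minimal, hypothetical PyTorch sketch of the training scheme the abstract describes: a composition model induces a binary parse and constituent representations under a bi-directional (cloze-style) loss, while a left-to-right language model is trained with a uni-directional loss and reads the constituent representation through a detached surrogate, so both objectives can be optimized jointly. All names (CompositionModel, SyntacticLM, etc.) are illustrative, greedy pair-merging stands in for the paper's unsupervised parser, and a GRU stands in for the Transformer backbone; this is not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

V, D = 1000, 64  # toy vocabulary and hidden size

class CompositionModel(nn.Module):
    """Greedy bottom-up composition: repeatedly merge the highest-scoring
    adjacent pair of nodes until one root remains (a stand-in for the
    paper's unsupervised parser)."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(V, D)
        self.score = nn.Linear(2 * D, 1)                  # scores an adjacent pair
        self.compose = nn.Sequential(nn.Linear(2 * D, D), nn.Tanh())
        self.cloze = nn.Linear(D, V)                      # bi-directional (cloze) head

    def forward(self, tokens):                            # tokens: (T,)
        nodes = list(self.emb(tokens))                    # leaf representations
        merges = []                                       # induced tree as a merge order
        while len(nodes) > 1:
            pairs = torch.stack([torch.cat([a, b]) for a, b in zip(nodes, nodes[1:])])
            i = self.score(pairs).argmax().item()         # hard-EM: commit to one split
            merges.append(i)
            nodes[i:i + 2] = [self.compose(pairs[i])]     # replace the pair by its parent
        root = nodes[0]
        # Toy bi-directional loss: reconstruct every token from the root.
        # (The actual objective conditions on surrounding constituents;
        # the split scorer would also need its own signal, omitted here.)
        logits = self.cloze(root).expand(len(tokens), V)
        loss = F.cross_entropy(logits, tokens)
        return root, merges, loss

class SyntacticLM(nn.Module):
    """Uni-directional LM that additionally conditions on the detached
    constituent representation -- the 'surrogate' decoupling the models."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(V, D)
        self.rnn = nn.GRU(D, D, batch_first=True)         # stand-in for a causal Transformer
        self.out = nn.Linear(D, V)

    def forward(self, tokens, surrogate):
        x = self.emb(tokens[:-1]).unsqueeze(0) + surrogate  # inject constituent info
        h, _ = self.rnn(x)
        return F.cross_entropy(self.out(h.squeeze(0)), tokens[1:])

comp, slm = CompositionModel(), SyntacticLM()
opt = torch.optim.Adam(list(comp.parameters()) + list(slm.parameters()), lr=1e-3)

tokens = torch.randint(0, V, (12,))                       # one toy "sentence"
root, merges, bi_loss = comp(tokens)
uni_loss = slm(tokens, root.detach())                     # detach = the surrogate boundary
(bi_loss + uni_loss).backward()                           # joint update of both components
opt.step()
print("induced merge order:", merges)
```

The `detach()` call is the key design point this sketch tries to convey: the uni-directional loss never back-propagates into the composition model, so the two models can be trained jointly and in parallel without interfering with each other's objectives.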
Anthology ID:
2024.acl-long.145
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2640–2657
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.acl-long.145/
DOI:
10.18653/v1/2024.acl-long.145
Cite (ACL):
Xiang Hu, Pengyu Ji, Qingyang Zhu, Wei Wu, and Kewei Tu. 2024. Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2640–2657, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Generative Pretrained Structured Transformers: Unsupervised Syntactic Language Models at Scale (Hu et al., ACL 2024)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.acl-long.145.pdf