Enhancing Unsupervised Semantic Parsing with Distributed Contextual Representations

Zixuan Ling, Xiaoqing Zheng, Jianhan Xu, Jinshu Lin, Kai-Wei Chang, Cho-Jui Hsieh, Xuanjing Huang


Abstract
We extend the non-parametric Bayesian model of Titov and Klementiev (2011) to handle homonymy and polysemy by leveraging distributed contextual word and phrase representations pre-trained on a large collection of unlabelled texts. Unsupervised semantic parsing is then performed by decomposing sentences into fragments, clustering the fragments to abstract away syntactic variations of the same meaning, and predicting predicate-argument relations between the fragments. To better model the statistical dependencies between predicates and their arguments, we further employ a hierarchical Pitman-Yor process. An improved Metropolis-Hastings merge-split sampler is proposed to speed up the mixing and convergence of Markov chains by leveraging pre-trained distributed representations. Experimental results show that our models achieve better accuracy on both question-answering and relation extraction tasks.
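As background for the hierarchical Pitman-Yor process mentioned in the abstract, the sketch below shows the predictive rule of a single-level (non-hierarchical) Pitman-Yor Chinese restaurant process in Python. It is an illustrative sketch only, not the paper's implementation: the function name, parameter values, and toy usage are assumptions for exposition.

```python
import random

def pitman_yor_crp(n_items, discount=0.5, concentration=1.0, seed=0):
    """Illustrative sketch (not the paper's code): sample a partition of
    n_items via the Pitman-Yor Chinese restaurant process. With
    discount=0 this reduces to the ordinary Dirichlet-process CRP."""
    rng = random.Random(seed)
    tables = []       # tables[k] = number of items in cluster k
    assignments = []  # cluster index chosen for each item
    for _ in range(n_items):
        # Existing cluster k is chosen with weight (n_k - discount);
        # a new cluster opens with weight (concentration + discount * K).
        weights = [max(count - discount, 0.0) for count in tables]
        weights.append(concentration + discount * len(tables))
        r = rng.random() * sum(weights)
        k, acc = 0, 0.0
        for k, w in enumerate(weights):
            acc += w
            if r <= acc:
                break
        if k == len(tables):
            tables.append(1)  # open a new cluster
        else:
            tables[k] += 1
        assignments.append(k)
    return assignments

print(pitman_yor_crp(20))
```

A nonzero discount yields the power-law distribution of cluster sizes that motivates Pitman-Yor priors over a plain Dirichlet process when clustering linguistic units such as sentence fragments.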
Anthology ID:
2023.findings-acl.726
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11454–11465
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-acl.726/
DOI:
10.18653/v1/2023.findings-acl.726
Cite (ACL):
Zixuan Ling, Xiaoqing Zheng, Jianhan Xu, Jinshu Lin, Kai-Wei Chang, Cho-Jui Hsieh, and Xuanjing Huang. 2023. Enhancing Unsupervised Semantic Parsing with Distributed Contextual Representations. In Findings of the Association for Computational Linguistics: ACL 2023, pages 11454–11465, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Enhancing Unsupervised Semantic Parsing with Distributed Contextual Representations (Ling et al., Findings 2023)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-acl.726.pdf
Video:
https://preview.aclanthology.org/build-pipeline-with-new-library/2023.findings-acl.726.mp4