Tree Transformer’s Disambiguation Ability of Prepositional Phrase Attachment and Garden Path Effects

Lingling Zhou, Suzan Verberne, Gijs Wijnholds


Abstract
This work studies two types of ambiguity in natural language: prepositional phrase (PP) attachment ambiguity, and garden path constructions. Because these ambiguities differ in nature – one structural, the other incremental – we pretrain and evaluate the Tree Transformer of Wang et al. (2019), an unsupervised Transformer model that induces tree representations internally. To assess PP attachment ambiguity we inspect the model’s induced parse trees against a newly prepared dataset derived from the PP attachment corpus (Ratnaparkhi et al., 1994). Garden path effects are measured via the surprisal of the underlying language model on a number of dedicated test suites, following Futrell et al. (2019). For comparison we evaluate a pretrained supervised BiLSTM-based model trained on constituency parsing as sequence labelling (Gómez-Rodríguez and Vilares, 2018). Results show that the unsupervised Tree Transformer does exhibit garden path effects, but its parsing ability is far inferior to that of the supervised BiLSTM, and it is not as sensitive to lexical cues as other large LSTM models, suggesting that supervised parsers based on a pre-Transformer architecture may be the better choice in the presence of ambiguity.
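For readers unfamiliar with surprisal-based garden path diagnostics, the sketch below illustrates the general idea: a word's surprisal is its negative log-probability under an autoregressive language model, and a garden path effect shows up as elevated surprisal at the disambiguating region of an ambiguous sentence relative to an unambiguous control. This is a minimal illustrative example using GPT-2 via HuggingFace transformers, not the paper's Tree Transformer or its test suites.

```python
# Minimal sketch: per-token surprisal (-log2 p) from an autoregressive LM.
# GPT-2 is used purely for illustration; the paper evaluates its own
# pretrained Tree Transformer language model on dedicated test suites.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def surprisals(sentence: str):
    """Return (token, surprisal in bits) for each token after the first."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits                      # [1, seq_len, vocab]
    # Log-probability of token t conditioned on tokens < t.
    log_probs = torch.log_softmax(logits[:, :-1, :], dim=-1)
    targets = ids[:, 1:]
    token_lp = log_probs.gather(-1, targets.unsqueeze(-1)).squeeze(-1)[0]
    bits = -token_lp / torch.log(torch.tensor(2.0))     # nats -> bits
    tokens = tokenizer.convert_ids_to_tokens(targets[0])
    return list(zip(tokens, bits.tolist()))

# Garden-path diagnostic: compare surprisal at the disambiguating verb
# ("fell") in a classic ambiguous sentence vs. an unambiguous control.
for s in ["The horse raced past the barn fell.",
          "The horse that was raced past the barn fell."]:
    print(s)
    for tok, b in surprisals(s):
        print(f"  {tok:>12s}  {b:6.2f} bits")
```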
Anthology ID:
2024.acl-long.664
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
12291–12301
URL:
https://aclanthology.org/2024.acl-long.664
DOI:
10.18653/v1/2024.acl-long.664
Cite (ACL):
Lingling Zhou, Suzan Verberne, and Gijs Wijnholds. 2024. Tree Transformer’s Disambiguation Ability of Prepositional Phrase Attachment and Garden Path Effects. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 12291–12301, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Tree Transformer’s Disambiguation Ability of Prepositional Phrase Attachment and Garden Path Effects (Zhou et al., ACL 2024)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2024.acl-long.664.pdf