Unsupervised Natural Language Parsing (Introductory Tutorial)

Kewei Tu, Yong Jiang, Wenjuan Han, Yanpeng Zhao


Abstract
Unsupervised parsing learns a syntactic parser from training sentences without parse tree annotations. Recently, there has been a resurgence of interest in unsupervised parsing, which can be attributed to the combination of two trends in the NLP community: a general trend towards unsupervised training or pre-training, and an emerging trend towards finding or modeling linguistic structures in neural models. In this tutorial, we will introduce to the general audience what unsupervised parsing does and how it can be useful for and beyond syntactic parsing. We will then provide a systematic overview of major classes of approaches to unsupervised parsing, namely generative and discriminative approaches, and analyze their relative strengths and weaknesses. We will cover both decade-old statistical approaches and more recent neural approaches to give the audience a sense of the historical and recent development of the field. We will also discuss emerging research topics such as BERT-based approaches and visually grounded learning.
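To make the generative class of approaches mentioned in the abstract concrete: classic generative unsupervised parsers (e.g., PCFG induction) fit rule probabilities with the inside-outside (EM) algorithm, whose core subroutine is the inside algorithm sketched below. This is a minimal illustrative sketch, not code from the tutorial; the grammar, rule probabilities, and example sentence are invented toy assumptions.

# Minimal sketch of the inside algorithm for a toy PCFG in Chomsky normal
# form. Sentence marginals computed this way are the E-step backbone of
# EM-based (inside-outside) unsupervised parsing. Toy grammar, for
# illustration only.
from collections import defaultdict

# Rule probabilities: binary rules A -> B C and lexical rules A -> w.
binary = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("Det", "N")): 1.0,
    ("VP", ("V", "NP")): 1.0,
}
lexical = {
    ("Det", "the"): 1.0,
    ("N", "dog"): 0.5,
    ("N", "cat"): 0.5,
    ("V", "saw"): 1.0,
}

def inside(sentence):
    """Return chart where chart[i][j][A] = P(A derives words i..j-1)."""
    n = len(sentence)
    chart = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    # Base case: length-1 spans are covered by lexical rules.
    for i, w in enumerate(sentence):
        for (a, word), p in lexical.items():
            if word == w:
                chart[i][i + 1][a] += p
    # Recursive case: combine adjacent spans with binary rules (CKY order).
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            for k in range(i + 1, j):
                for (a, (b, c)), p in binary.items():
                    chart[i][j][a] += p * chart[i][k][b] * chart[k][j][c]
    return chart

sent = "the dog saw the cat".split()
chart = inside(sent)
print(chart[0][len(sent)]["S"])  # sentence probability under the toy grammar: 0.25

In full inside-outside training, these inside scores are paired with outside scores to compute expected rule counts on unannotated sentences, and rule probabilities are re-estimated from those counts; discriminative and neural approaches covered in the tutorial replace or parameterize parts of this pipeline differently.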
Anthology ID:
2021.eacl-tutorials.1
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Tutorial Abstracts
Month:
April
Year:
2021
Address:
Online
Editors:
Isabelle Augenstein, Ivan Habernal
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1–5
URL:
https://aclanthology.org/2021.eacl-tutorials.1
DOI:
10.18653/v1/2021.eacl-tutorials.1
Cite (ACL):
Kewei Tu, Yong Jiang, Wenjuan Han, and Yanpeng Zhao. 2021. Unsupervised Natural Language Parsing (Introductory Tutorial). In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Tutorial Abstracts, pages 1–5, Online. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Natural Language Parsing (Introductory Tutorial) (Tu et al., EACL 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2021.eacl-tutorials.1.pdf