Randomized Deep Structured Prediction for Discourse-Level Processing

Manuel Widmoser, Maria Leonor Pacheco, Jean Honorio, Dan Goldwasser


Abstract
Expressive text encoders such as RNNs and Transformer networks have been at the center of NLP models in recent work. Most of the effort has focused on sentence-level tasks, capturing the dependencies between words in a single sentence or between pairs of sentences. However, certain tasks, such as argumentation mining, require accounting for longer texts and the complicated structural dependencies within them. Deep structured prediction is a general framework for combining the complementary strengths of expressive neural encoders and structured inference in highly structured domains. Nevertheless, when the need arises to go beyond sentences, most work relies on combining the output scores of independently trained classifiers. One of the main reasons for this is that constrained inference comes at a high computational cost. In this paper, we explore the use of randomized inference to alleviate this concern and show that we can efficiently leverage deep structured prediction and expressive neural encoders for a set of tasks involving complicated argumentative structures.
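To give a rough sense of the idea the abstract refers to, the sketch below illustrates, in very simplified form, what randomized inference over structured outputs can look like: rather than exactly maximizing a joint score over all label assignments, a small random sample of candidate structures is scored and the best one is kept. This is not the paper's actual model or inference procedure; the label set, scoring function, and pairwise term are hypothetical placeholders standing in for a trained neural encoder and real structural constraints.

```python
# Minimal, illustrative sketch of randomized inference for structured
# prediction (NOT the authors' implementation). Instead of searching the
# exponential space of joint label assignments, we score a small random
# sample of candidate structures and keep the best one.
import random

LABELS = ["claim", "premise", "non-arg"]           # hypothetical component labels
PAIR_BONUS = {("premise", "claim"): 1.0}           # hypothetical structural preference


def score(structure, unary_scores):
    """Joint score = per-segment (encoder) scores + a simple pairwise term."""
    s = sum(unary_scores[i][lab] for i, lab in enumerate(structure))
    s += sum(PAIR_BONUS.get((a, b), 0.0) for a, b in zip(structure, structure[1:]))
    return s


def randomized_inference(unary_scores, num_samples=50, seed=0):
    """Approximate the argmax over structures by sampling candidates at random."""
    rng = random.Random(seed)
    n = len(unary_scores)
    best, best_score = None, float("-inf")
    for _ in range(num_samples):
        candidate = tuple(rng.choice(LABELS) for _ in range(n))
        s = score(candidate, unary_scores)
        if s > best_score:
            best, best_score = candidate, s
    return best, best_score


if __name__ == "__main__":
    # Fake per-segment label scores, standing in for a neural encoder's output.
    unary = [
        {"claim": 0.2, "premise": 1.1, "non-arg": 0.1},
        {"claim": 1.3, "premise": 0.4, "non-arg": 0.2},
        {"claim": 0.3, "premise": 0.9, "non-arg": 0.8},
    ]
    print(randomized_inference(unary))
```

The appeal of this family of methods is that the cost of inference is controlled by the number of sampled candidates rather than by the size of the full output space, which is what makes joint training with expressive encoders affordable at the discourse level.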
Anthology ID:
2021.eacl-main.100
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1174–1184
URL:
https://aclanthology.org/2021.eacl-main.100
DOI:
10.18653/v1/2021.eacl-main.100
Cite (ACL):
Manuel Widmoser, Maria Leonor Pacheco, Jean Honorio, and Dan Goldwasser. 2021. Randomized Deep Structured Prediction for Discourse-Level Processing. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1174–1184, Online. Association for Computational Linguistics.
Cite (Informal):
Randomized Deep Structured Prediction for Discourse-Level Processing (Widmoser et al., EACL 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.eacl-main.100.pdf