An Attentive Recurrent Model for Incremental Prediction of Sentence-final Verbs

Wenyan Li, Alvin Grissom II, Jordan Boyd-Graber


Abstract
Verb prediction is important for understanding human processing of verb-final languages, and it has practical applications in real-time simultaneous interpretation from verb-final to verb-medial languages. While previous approaches use classical statistical models, we introduce an attention-based neural model that incrementally predicts the final verb of incomplete Japanese and German SOV sentences. To give the model flexibility, we further incorporate synonym awareness. Our approach both better predicts sentence-final verbs in Japanese and German and provides more interpretable explanations of why those verbs are selected.
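For orientation, the sketch below illustrates the general kind of model the abstract describes: a recurrent encoder over the sentence prefix with attention over its hidden states, scoring candidate final verbs. This is a minimal PyTorch sketch; the class and parameter names are hypothetical and the paper itself specifies the actual architecture, including the synonym-awareness component, which is not shown here.

```python
import torch
import torch.nn as nn

class AttentiveVerbPredictor(nn.Module):
    """Illustrative attentive recurrent verb predictor (not the authors' code):
    an LSTM reads the incomplete sentence, additive attention summarizes the
    prefix, and a linear layer scores the sentence-final verb vocabulary."""

    def __init__(self, vocab_size, num_verbs, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.attn = nn.Linear(hid_dim, 1)          # one score per prefix position
        self.out = nn.Linear(hid_dim, num_verbs)   # classifier over final verbs

    def forward(self, prefix_ids):
        # prefix_ids: (batch, prefix_len) token ids of the incomplete sentence
        h, _ = self.lstm(self.embed(prefix_ids))        # (batch, len, hid)
        weights = torch.softmax(self.attn(h), dim=1)    # (batch, len, 1)
        context = (weights * h).sum(dim=1)              # attention-weighted summary
        # Return verb logits plus the attention weights, which indicate
        # which prefix words drove the prediction (the interpretability angle).
        return self.out(context), weights.squeeze(-1)
```

In incremental use, such a model would be applied to successively longer prefixes of the same sentence; a downstream application like simultaneous interpretation could commit to a verb once its predicted probability is high enough.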
Anthology ID:
2020.findings-emnlp.12
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
126–136
URL:
https://aclanthology.org/2020.findings-emnlp.12
DOI:
10.18653/v1/2020.findings-emnlp.12
Cite (ACL):
Wenyan Li, Alvin Grissom II, and Jordan Boyd-Graber. 2020. An Attentive Recurrent Model for Incremental Prediction of Sentence-final Verbs. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 126–136, Online. Association for Computational Linguistics.
Cite (Informal):
An Attentive Recurrent Model for Incremental Prediction of Sentence-final Verbs (Li et al., Findings 2020)
PDF:
https://preview.aclanthology.org/fix-dup-bibkey/2020.findings-emnlp.12.pdf
Optional supplementary material:
2020.findings-emnlp.12.OptionalSupplementaryMaterial.zip