2020

Improving Event Duration Prediction via Time-aware Pre-training
Zonglin Yang | Xinya Du | Alexander Rush | Claire Cardie
Findings of the Association for Computational Linguistics: EMNLP 2020

End-to-end models in NLP rarely encode external world knowledge about how long events last. We introduce two effective models for duration prediction that incorporate external knowledge by reading temporally related news sentences (time-aware pre-training). Specifically, one model predicts the range/unit in which the duration value falls (R-PRED), and the other predicts the exact duration value (E-PRED). Our best model, E-PRED, substantially outperforms previous work and captures duration information more accurately than R-PRED. We also demonstrate that our models are capable of duration prediction in the unsupervised setting, outperforming the baselines.
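
The contrast between the two heads can be made concrete with a minimal sketch. This is not the authors' released code: it assumes a BERT-style encoder, and the unit inventory, head design, and model name are illustrative. R-PRED is framed as classification over duration units, E-PRED as regression of a single duration value.

```python
# Minimal sketch (illustrative, not the paper's implementation) of the two
# duration-prediction heads: R-PRED classifies the range/unit, E-PRED
# regresses the exact duration value.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

# Hypothetical set of duration units for R-PRED.
UNITS = ["seconds", "minutes", "hours", "days", "weeks", "months", "years"]

class DurationPredictor(nn.Module):
    def __init__(self, encoder_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # R-PRED head: logits over the duration units above.
        self.range_head = nn.Linear(hidden, len(UNITS))
        # E-PRED head: a single scalar (e.g., log-duration).
        self.exact_head = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] sentence representation
        return self.range_head(cls), self.exact_head(cls).squeeze(-1)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = DurationPredictor()
batch = tokenizer(["The meeting lasted all afternoon."], return_tensors="pt")
range_logits, exact_value = model(batch["input_ids"], batch["attention_mask"])
print(UNITS[range_logits.argmax(-1).item()], exact_value.item())
```

In this framing, time-aware pre-training would supply the supervision signal by mining duration mentions from news text before fine-tuning either head; the sketch shows only the two prediction targets the abstract contrasts.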