Abstract
“Natural language understanding tasks require a comprehensive grasp of natural language and further reasoning over it, drawing on holistic information at different levels. In recent years, pre-trained language models (PrLMs) have shown impressive performance on natural language understanding. However, they rely mainly on extracting context-sensitive statistical patterns without explicitly modeling linguistic information, such as the semantic relationships entailed in natural language. In this work, we propose EventBERT, an event-based semantic representation model that takes BERT as the backbone and refines it with event-based structural semantics via graph convolutional networks. EventBERT benefits simultaneously from the rich event-based structures embodied in the graph and the contextual semantics learned by the pre-trained BERT model. Experimental results on the GLUE benchmark show that the proposed model consistently outperforms the baseline.”
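As a rough illustration of the architecture the abstract describes, the PyTorch sketch below combines BERT's contextual token representations with a graph convolutional network applied over an event graph, then fuses the two views for classification. This is a minimal sketch under stated assumptions, not the paper's implementation: the `EventBertSketch` class name, the identity-matrix placeholder adjacency, the two-layer GCN, and the residual-sum fusion are all illustrative choices; in the paper's setting the adjacency would encode event-based structure extracted from the input.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class GraphConvolution(nn.Module):
    """One GCN layer: H' = ReLU(A @ H @ W), with A a (normalized) adjacency."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, hidden: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, dim); adj: (batch, seq_len, seq_len)
        return torch.relu(adj @ self.linear(hidden))


class EventBertSketch(nn.Module):
    """Hypothetical sketch: a BERT backbone refined by an event-graph GCN."""

    def __init__(self, num_labels: int = 2, gcn_layers: int = 2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        dim = self.bert.config.hidden_size
        self.gcn = nn.ModuleList(GraphConvolution(dim) for _ in range(gcn_layers))
        self.classifier = nn.Linear(dim, num_labels)

    def forward(self, input_ids, attention_mask, event_adj):
        # Contextual semantics from the pre-trained backbone.
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Structural semantics: propagate along event-graph edges.
        graph_hidden = hidden
        for layer in self.gcn:
            graph_hidden = layer(graph_hidden, event_adj)
        # Fuse both views (a simple residual sum here) and classify from [CLS].
        fused = hidden + graph_hidden
        return self.classifier(fused[:, 0])


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    batch = tokenizer(["The market fell after the bank announced losses."],
                      return_tensors="pt")
    seq_len = batch["input_ids"].shape[1]
    # An identity adjacency keeps the sketch runnable; a real event graph would
    # connect tokens participating in the same event (predicate-argument links).
    adj = torch.eye(seq_len).unsqueeze(0)
    model = EventBertSketch()
    logits = model(batch["input_ids"], batch["attention_mask"], adj)
    print(logits.shape)  # torch.Size([1, 2])
```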
- Anthology ID:
- 2022.ccl-1.69
- Volume:
- Proceedings of the 21st Chinese National Conference on Computational Linguistics
- Month:
- October
- Year:
- 2022
- Address:
- Nanchang, China
- Editors:
- Maosong Sun (孙茂松), Yang Liu (刘洋), Wanxiang Che (车万翔), Yang Feng (冯洋), Xipeng Qiu (邱锡鹏), Gaoqi Rao (饶高琦), Yubo Chen (陈玉博)
- Venue:
- CCL
- Publisher:
- Chinese Information Processing Society of China
- Pages:
- 774–785
- Language:
- English
- URL:
- https://aclanthology.org/2022.ccl-1.69
- Cite (ACL):
Anni Zou, Zhuosheng Zhang, and Hai Zhao. 2022. EventBERT: Incorporating Event-based Semantics for Natural Language Understanding. In Proceedings of the 21st Chinese National Conference on Computational Linguistics, pages 774–785, Nanchang, China. Chinese Information Processing Society of China.
- Cite (Informal):
- EventBERT: Incorporating Event-based Semantics for Natural Language Understanding (Zou et al., CCL 2022)
- PDF:
- https://aclanthology.org/2022.ccl-1.69.pdf
- Data
- CoLA, GLUE, MRPC, QNLI, SST, SST-2
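The datasets listed above are all GLUE tasks (SST-2 is the binary-label version of SST). As a convenience sketch only, assuming the Hugging Face `datasets` library, which the paper does not prescribe, the evaluation splits can be loaded as follows; note that GLUE's config name for SST-2 is `sst2`:

```python
from datasets import load_dataset

# GLUE config names for the datasets listed: CoLA, MRPC, QNLI, SST-2.
for task in ["cola", "mrpc", "qnli", "sst2"]:
    ds = load_dataset("glue", task)
    print(task, {split: len(ds[split]) for split in ds})
```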