Jing Yu
2021
Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees
Jiangang Bai | Yujing Wang | Yiren Chen | Yaming Yang | Jing Bai | Jing Yu | Yunhai Tong
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Pre-trained language models like BERT achieve superior performance in various NLP tasks without explicit consideration of syntactic information, even though syntactic information has been shown to be crucial for the success of many NLP applications. How to incorporate syntax trees effectively and efficiently into pre-trained Transformers, however, remains an open question. In this paper, we address this problem by proposing a novel framework named Syntax-BERT. The framework works in a plug-and-play mode and is applicable to an arbitrary pre-trained checkpoint based on the Transformer architecture. Experiments on various natural language understanding datasets verify the effectiveness of syntax trees, showing consistent improvements over multiple pre-trained models, including BERT, RoBERTa, and T5.
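Illustrative note: the plug-and-play mechanism described in the abstract can be sketched as masking self-attention with a syntax tree. The sketch below is a minimal illustration, not the paper's actual implementation; the `heads` parent-index encoding, the function names, and the `max_dist` cutoff are all assumptions made for this example.

```python
# Minimal sketch (NOT the paper's implementation): restricting self-attention
# with a dependency tree. Each token may only attend to tokens within a fixed
# distance in the tree; all names here are illustrative.
import numpy as np

def tree_distances(heads):
    """All-pairs distances in a dependency tree given parent indices
    (heads[i] = parent of token i, -1 for the root)."""
    n = len(heads)
    adj = np.full((n, n), np.inf)
    np.fill_diagonal(adj, 0)
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i, h] = adj[h, i] = 1
    for k in range(n):  # Floyd-Warshall; fine for sentence-length n
        adj = np.minimum(adj, adj[:, k:k+1] + adj[k:k+1, :])
    return adj

def syntax_masked_attention(Q, K, V, heads, max_dist=2):
    """Scaled dot-product attention restricted to tree neighborhoods."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(tree_distances(heads) <= max_dist, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy usage: 4 tokens, token 1 is the root of the parse.
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(4, 8))
out = syntax_masked_attention(Q, K, V, heads=[1, -1, 1, 2])
print(out.shape)  # (4, 8)
```

Because the mask only gates an otherwise standard attention layer, it can in principle be dropped into any pre-trained Transformer checkpoint, which is the sense in which such an approach is plug-and-play.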
2020
Bi-directional Cognitive Thinking Network for Machine Reading Comprehension
Wei Peng | Yue Hu | Luxi Xing | Yuqiang Xie | Jing Yu | Yajing Sun | Xiangpeng Wei
Proceedings of the 28th International Conference on Computational Linguistics
We propose a novel Bi-directional Cognitive Knowledge Framework (BCKF) for reading comprehension from the perspective of complementary learning systems theory. It aims to simulate two ways of thinking in the brain when answering questions: reverse thinking and inertial thinking. To validate the effectiveness of the framework, we design a corresponding Bi-directional Cognitive Thinking Network (BCTN) that encodes the passage, generates a question (answer) given an answer (question), and decouples the bi-directional knowledge. The model is able to reason about questions in reverse, which assists inertial thinking in generating more accurate answers. Competitive improvement is observed on the DuReader dataset, confirming our hypothesis that bi-directional knowledge helps the QA task. The framework offers an interesting perspective on machine reading comprehension and cognitive science.
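Illustrative note: the bi-directional idea can be sketched, very roughly, as a shared passage encoder with two generation heads trained in opposite directions. The sketch below is only an illustration under that assumption; the class name, the concatenation-based conditioning, and the toy dimensions are all hypothetical and do not reflect the BCTN architecture.

```python
# Rough sketch (illustrative only, NOT the BCTN architecture): a shared
# passage encoder with two heads trained in opposite directions --
# question -> answer ("inertial") and answer -> question ("reverse").
import torch
import torch.nn as nn

class BiDirectionalQA(nn.Module):
    def __init__(self, vocab=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)   # shared encoder
        self.answer_head = nn.Linear(dim, vocab)            # Q -> A direction
        self.question_head = nn.Linear(dim, vocab)          # A -> Q direction

    def forward(self, passage, condition):
        # Condition the passage encoding on either a question or an answer
        # by simple concatenation (a stand-in for the paper's decoupling).
        x = self.embed(torch.cat([condition, passage], dim=1))
        h, _ = self.encoder(x)
        return self.answer_head(h), self.question_head(h)

model = BiDirectionalQA()
passage = torch.randint(0, 1000, (2, 20))
question = torch.randint(0, 1000, (2, 5))
answer_logits, question_logits = model(passage, question)
# Training would sum losses from both directions, so that the reverse
# (question-generation) signal regularizes answer generation.
print(answer_logits.shape)  # torch.Size([2, 25, 1000])
```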