Wen Junjie
2023
FinBART: A Pre-trained Seq2seq Language Model for Chinese Financial Tasks
Dong Hongyuan | Che Wanxiang | He Xiaoyu | Zheng Guidong | Wen Junjie
Proceedings of the 22nd Chinese National Conference on Computational Linguistics
Pretrained language models are making a more profound impact on our lives than ever before. They exhibit promising performance on a variety of general-domain Natural Language Processing (NLP) tasks. However, little work focuses on Chinese financial NLP tasks, which comprise a significant portion of social communication. To this end, we propose FinBART, a pretrained seq2seq language model for Chinese financial communication tasks. Experiments show that FinBART outperforms baseline models on a series of downstream tasks including text classification, sequence labeling and text generation. We further pretrain the model on customer service corpora, and results show that our model outperforms baseline models and achieves promising performance on various real-world customer service text mining tasks.
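As a minimal sketch of the fine-tuning setup the abstract describes (a BART-style Chinese seq2seq model adapted to a downstream generation task), the snippet below uses the Hugging Face transformers API. The public fnlp/bart-base-chinese checkpoint is only a stand-in, since the abstract does not say a FinBART checkpoint is released, and the example sentences are invented.

```python
from transformers import BertTokenizer, BartForConditionalGeneration

# Stand-in checkpoint; swap in a FinBART checkpoint if one is available.
tokenizer = BertTokenizer.from_pretrained("fnlp/bart-base-chinese")
model = BartForConditionalGeneration.from_pretrained("fnlp/bart-base-chinese")

# One toy (source, target) pair in the style of a customer service corpus.
src = "请问信用卡的年费是什么时候扣的？"   # "When is the credit card annual fee charged?"
tgt = "年费一般在每个账单年度扣收一次。"    # "The annual fee is usually charged once per billing year."

enc = tokenizer(src, return_tensors="pt")
labels = tokenizer(tgt, return_tensors="pt")["input_ids"]

# Seq2seq cross-entropy loss on the target; a single fine-tuning step
# (optimizer and training loop omitted for brevity).
loss = model(input_ids=enc["input_ids"],
             attention_mask=enc["attention_mask"],
             labels=labels).loss
loss.backward()

# Generation for a downstream text generation task.
generated = model.generate(enc["input_ids"], max_new_tokens=32)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```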
2022
All Information is Valuable: Question Matching over Full Information Transmission Network
Le Qi | Yu Zhang | Qingyu Yin | Guidong Zheng | Wen Junjie | Jinlong Li | Ting Liu
Findings of the Association for Computational Linguistics: NAACL 2022
Question matching is the task of identifying whether two questions have the same intent. To better reason about the relationship between questions, existing studies adopt multiple interaction modules and perform multi-round reasoning via deep neural networks. In this process, two kinds of critical information are commonly employed: the representation information of the original questions and the interactive information between pairs of questions. However, previous studies tend to transmit only one kind of information, failing to utilize both simultaneously. To address this problem, in this paper, we propose a Full Information Transmission Network (FITN) that transmits both representation and interactive information simultaneously. More specifically, we employ a novel memory-based attention for keeping and transmitting the interactive information through a global interaction matrix. In addition, we apply an original-average mixed connection method to effectively transmit the representation information between different reasoning rounds, which helps to preserve the original representation features of questions along with the historical hidden features. Experiments on two standard benchmarks demonstrate that our approach outperforms strong baseline models.
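The abstract names two carriers of information across reasoning rounds: a memory-based attention over a global interaction matrix (interactive information) and an original-average mixed connection (representation information). Below is a minimal PyTorch sketch of one such round under assumed tensor shapes, a simple additive memory, and a half-and-half mixing rule; it illustrates the idea rather than reproducing the paper's FITN implementation, and all class and variable names are invented.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FullInfoRound(nn.Module):
    """One reasoning round that passes on both kinds of information:
    a global interaction matrix (interactive info) and an original-average
    mixed connection (representation info)."""
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(4 * dim, dim)

    def forward(self, h1, h2, q1_orig, q2_orig, memory):
        # Interactive information: token-to-token interaction scores,
        # accumulated across rounds as a global interaction matrix.
        inter = torch.matmul(h1, h2.transpose(1, 2))             # (B, L1, L2)
        memory = inter if memory is None else memory + inter
        attn12 = torch.matmul(F.softmax(memory, dim=-1), h2)     # q1 attends to q2
        attn21 = torch.matmul(F.softmax(memory.transpose(1, 2), dim=-1), h1)

        # Fuse each question with what it attended to in the other question.
        new_h1 = self.proj(torch.cat([h1, attn12, h1 - attn12, h1 * attn12], -1))
        new_h2 = self.proj(torch.cat([h2, attn21, h2 - attn21, h2 * attn21], -1))

        # Representation information: average the original question features
        # with the current hidden features before the next round.
        new_h1 = (new_h1 + q1_orig) / 2
        new_h2 = (new_h2 + q2_orig) / 2
        return new_h1, new_h2, memory

# Usage with random tensors standing in for encoded question pairs.
round_fn = FullInfoRound(dim=128)
q1, q2 = torch.randn(2, 10, 128), torch.randn(2, 12, 128)
h1, h2, mem = q1, q2, None
for _ in range(3):                     # three reasoning rounds
    h1, h2, mem = round_fn(h1, h2, q1, q2, mem)
```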