Xiaofeng Chen
2022
Modeling Compositionality with Dependency Graph for Dialogue Generation
Xiaofeng Chen | Yirong Chen | Xiaofen Xing | Xiangmin Xu | Wenjing Han | Qianfeng Tie
Proceedings of the Workshop on Structured and Unstructured Knowledge Integration (SUKI)
Because of the compositionality of natural language, syntactic structure, which encodes the relationships between words, is a key factor in semantic understanding. However, the widely adopted Transformer struggles to learn syntactic structure effectively in dialogue generation tasks. To explicitly model the compositionality of language in the Transformer block, we restrict the information flow between words by constructing a directed dependency graph and propose Dependency Relation Attention (DRA). Experimental results demonstrate that DRA further improves the performance of state-of-the-art models for dialogue generation.
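The mechanism the abstract describes, restricting attention along a directed dependency graph, can be illustrated with a short sketch. This is a minimal illustration in PyTorch, not the paper's actual DRA formulation (which may, for instance, incorporate relation-type information); the `dependency_mask` and `dependency_attention` helpers are hypothetical names introduced here.

```python
# Sketch: self-attention restricted to directed dependency edges.
# Assumption: DRA is approximated as a hard attention mask over the graph.
import math
import torch
import torch.nn.functional as F

def dependency_mask(heads: list[int]) -> torch.Tensor:
    """Build a directed adjacency mask from dependency heads.

    heads[i] is the index of token i's head (-1 for the root).
    mask[i, j] = True means token i may attend to token j.
    """
    n = len(heads)
    mask = torch.eye(n, dtype=torch.bool)  # every token attends to itself
    for i, h in enumerate(heads):
        if h >= 0:
            mask[i, h] = True  # dependent -> head edge
    return mask

def dependency_attention(q, k, v, mask):
    """Scaled dot-product attention with non-edge positions masked out."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Toy usage: "she saw it" with heads from a hypothetical parse ("saw" = root).
heads = [1, -1, 1]
x = torch.randn(3, 8)  # 3 tokens, 8-dim embeddings
out = dependency_attention(x, x, x, dependency_mask(heads))
```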
2020
A Two-phase Prototypical Network Model for Incremental Few-shot Relation Classification
Haopeng Ren | Yi Cai | Xiaofeng Chen | Guohua Wang | Qing Li
Proceedings of the 28th International Conference on Computational Linguistics
Relation Classification (RC) plays an important role in natural language processing (NLP). Conventional supervised and distantly supervised RC models make a closed-world assumption, ignoring the emergence of novel relations in open environments. To incrementally recognize novel relations, two current solutions (i.e., re-training and lifelong learning) have been designed, but both suffer from the lack of large-scale labeled data for novel relations. Meanwhile, prototypical networks perform well in both deep supervised learning and few-shot learning; however, they still suffer from an incompatible feature embedding problem when novel relations arrive. Motivated by these observations, we propose a two-phase prototypical network with prototype attention alignment and triplet loss that dynamically recognizes novel relations from only a few support instances without catastrophic forgetting. Extensive experiments are conducted to evaluate the effectiveness of our proposed model.
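As background for the abstract, a prototypical network classifies a query by its distance to class prototypes, the mean embeddings of each class's support instances. The sketch below shows only this standard baseline plus PyTorch's off-the-shelf triplet loss; the paper's two-phase training and prototype attention alignment are not reproduced, and all names here are illustrative.

```python
# Sketch: prototypical-network classification (the standard baseline the
# paper builds on, not the paper's two-phase model).
import torch

def prototypes(support_emb: torch.Tensor) -> torch.Tensor:
    """support_emb: (n_classes, n_shot, dim) -> prototypes (n_classes, dim)."""
    return support_emb.mean(dim=1)

def classify(query_emb: torch.Tensor, protos: torch.Tensor) -> torch.Tensor:
    """Assign each query to the nearest prototype by Euclidean distance."""
    dists = torch.cdist(query_emb, protos)  # (n_query, n_classes)
    return dists.argmin(dim=-1)

# Toy usage: 5-way 3-shot with random 16-dim embeddings.
support = torch.randn(5, 3, 16)
queries = torch.randn(4, 16)
preds = classify(queries, prototypes(support))  # (4,) predicted class indices

# The abstract also mentions a triplet loss; PyTorch provides one directly.
triplet = torch.nn.TripletMarginLoss(margin=1.0)
anchor, positive, negative = torch.randn(3, 4, 16)
loss = triplet(anchor, positive, negative)
```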