Wei Ding


Contrastive Learning of Sentence Representations
Hefei Qiu | Wei Ding | Ping Chen
Proceedings of the 18th International Conference on Natural Language Processing (ICON)

Learning sentence representations that capture rich semantic meaning is crucial for many NLP tasks. Pre-trained language models such as BERT have achieved great success in NLP, but sentence embeddings extracted directly from these models do not perform well without fine-tuning. We propose Contrastive Learning of Sentence Representations (CLSR), a novel approach which applies contrastive learning to learn universal sentence representations on top of pre-trained language models. CLSR utilizes the semantic similarity of two sentences to construct positive instances for contrastive learning. Semantic information already captured by the pre-trained models is preserved by extracting sentence embeddings from these models with a proper pooling strategy. An encoder followed by a linear projection takes these embeddings as inputs and is trained under a contrastive objective. To evaluate the performance of CLSR, we run experiments on a range of pre-trained language models and their variants on a series of Semantic Textual Similarity tasks. Results show that CLSR gains significant performance improvements over existing SOTA language models.
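The abstract's "encoder followed by a linear projection ... trained under a contrastive objective" can be illustrated with a minimal sketch. This is not the paper's implementation: the `ProjectionHead` dimensions and the InfoNCE-style loss below are assumptions chosen for illustration; the paper's exact architecture and objective may differ.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(anchor, positive, temperature=0.1):
    """InfoNCE-style objective (assumed here, not taken from the paper):
    row i of `positive` is the positive instance for row i of `anchor`;
    every other row in the batch serves as an in-batch negative."""
    a = F.normalize(anchor, dim=1)
    p = F.normalize(positive, dim=1)
    # (batch, batch) matrix of temperature-scaled cosine similarities
    logits = a @ p.t() / temperature
    # diagonal entries are the positive pairs
    labels = torch.arange(a.size(0))
    return F.cross_entropy(logits, labels)

class ProjectionHead(torch.nn.Module):
    """Hypothetical head matching the abstract's description: an encoder
    over pooled sentence embeddings, followed by a linear projection."""
    def __init__(self, dim_in=768, dim_hidden=768, dim_out=256):
        super().__init__()
        self.encoder = torch.nn.Sequential(
            torch.nn.Linear(dim_in, dim_hidden),
            torch.nn.ReLU(),
        )
        self.proj = torch.nn.Linear(dim_hidden, dim_out)

    def forward(self, x):
        return self.proj(self.encoder(x))
```

In training, `anchor` and `positive` would be the projected embeddings of two semantically similar sentences (pooled from a frozen or fine-tuned pre-trained model); the loss pulls each positive pair together while pushing apart the rest of the batch.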


TreeMatch: A Fully Unsupervised WSD System Using Dependency Knowledge on a Specific Domain
Andrew Tran | Chris Bowes | David Brown | Ping Chen | Max Choly | Wei Ding
Proceedings of the 5th International Workshop on Semantic Evaluation


A Fully Unsupervised Word Sense Disambiguation Method Using Dependency Knowledge
Ping Chen | Wei Ding | Chris Bowes | David Brown
Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American Chapter of the Association for Computational Linguistics


CHINERS: A Chinese Named Entity Recognition System for the Sports Domain
Tianfang Yao | Wei Ding | Gregor Erbach
Proceedings of the Second SIGHAN Workshop on Chinese Language Processing