Mairgup Mansur

Also published as: Mansur Mairgup


2018

Improved Dependency Parsing using Implicit Word Connections Learned from Unlabeled Data
Wenhui Wang | Baobao Chang | Mairgup Mansur
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing

Pre-trained word embeddings and language models have proven useful in many tasks. However, neither directly captures word connections within a sentence, which matter for dependency parsing, whose goal is to establish dependency relations between words. In this paper, we propose to capture word connections implicitly from unlabeled data using a word ordering model with a self-attention mechanism. Experiments show that these implicit word connections do improve our parsing model. Furthermore, combined with a pre-trained language model, our model achieves state-of-the-art performance on the English PTB dataset: 96.35% UAS and 95.25% LAS.
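The abstract describes reading word-to-word connections off a self-attention mechanism. A minimal sketch of how scaled dot-product self-attention yields such a connection matrix (the function name, single attention head, and randomly initialised projections are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def self_attention_connections(X, seed=0):
    """Scaled dot-product self-attention over word vectors X (n x d).

    Returns an (n x n) matrix whose entry [i, j] can be read as an
    implicit connection strength from word i to word j. Illustrative
    sketch only: projections are random stand-ins for learned weights.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Wq = rng.normal(size=(d, d))  # query projection (assumed shape)
    Wk = rng.normal(size=(d, d))  # key projection (assumed shape)
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax: each row becomes a distribution over all words.
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy "sentence" of 4 words with 8-dimensional embeddings.
X = np.random.default_rng(1).normal(size=(4, 8))
A = self_attention_connections(X)
```

Each row of `A` sums to 1, so row `i` distributes word `i`'s attention over the sentence; in the paper's setting such weights are learned via a word ordering objective on unlabeled text and then fed to the parser.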

2013

Exploring Representations from Unlabeled Data with Co-training for Chinese Word Segmentation
Longkai Zhang | Houfeng Wang | Xu Sun | Mairgup Mansur
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing

Feature-based Neural Language Model and Chinese Word Segmentation
Mairgup Mansur | Wenzhe Pei | Baobao Chang
Proceedings of the Sixth International Joint Conference on Natural Language Processing

2010

Chinese word segmentation model using bootstrapping
Baobao Chang | Mansur Mairgup
CIPS-SIGHAN Joint Conference on Chinese Language Processing