Xiaobao Wu


2022

Mitigating Data Sparsity for Short Text Topic Modeling by Topic-Semantic Contrastive Learning
Xiaobao Wu | Anh Tuan Luu | Xinshuai Dong
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing

To overcome the data sparsity issue in short text topic modeling, existing methods commonly rely on data augmentation or the data characteristics of short texts to introduce more word co-occurrence information. However, most of them do not make full use of the augmented data or the data characteristics: they insufficiently learn the relations among samples, leading to dissimilar topic distributions for semantically similar text pairs. To better address data sparsity, in this paper we propose a novel short text topic modeling framework, the Topic-Semantic Contrastive Topic Model (TSCTM). To sufficiently model the relations among samples, we employ a new contrastive learning method with efficient positive and negative sampling strategies based on topic semantics. This contrastive learning method refines the representations, enriches the learning signals, and thus mitigates the sparsity issue. Extensive experimental results show that our TSCTM outperforms state-of-the-art baselines regardless of whether data augmentation is available, producing high-quality topics and topic distributions.
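The abstract does not spell out TSCTM's exact sampling rules, but the core idea of contrasting topic distributions with topic-semantic positives and negatives can be sketched as follows. This is a minimal illustrative InfoNCE-style loss, not the paper's objective: positives are assumed to be samples sharing the same dominant topic, and all names and the temperature value are placeholders.

```python
# Illustrative sketch of a topic-semantic contrastive loss (assumption:
# positives are documents whose topic distributions share the same argmax topic).
import torch
import torch.nn.functional as F

def topic_semantic_contrastive_loss(theta: torch.Tensor, temperature: float = 0.5):
    """theta: (batch, K) doc-topic distributions from the topic model encoder."""
    z = F.normalize(theta, dim=1)                  # unit-length topic vectors
    sim = z @ z.t() / temperature                  # pairwise similarities
    labels = theta.argmax(dim=1)                   # dominant topic per document
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    pos_mask.fill_diagonal_(0)                     # exclude self-pairs

    # Log-softmax over all other samples; average log-probability of positives.
    logits = sim - 1e9 * torch.eye(len(z), device=z.device)  # mask self-similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(pos_mask * log_prob).sum(dim=1) / pos_count
    return loss.mean()
```

Minimizing this loss pulls topic distributions of semantically similar documents together and pushes dissimilar ones apart, which is the behavior the abstract attributes to the contrastive component.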

Adaptive Contrastive Learning on Multimodal Transformer for Review Helpfulness Prediction
Thong Nguyen | Xiaobao Wu | Anh Tuan Luu | Zhen Hai | Lidong Bing
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing

Modern Review Helpfulness Prediction systems depend on multiple modalities, typically texts and images. Unfortunately, contemporary approaches pay scarce attention to polishing representations of cross-modal relations and tend to suffer from inferior optimization, which can harm model predictions in numerous cases. To overcome these issues, we propose Multi-modal Contrastive Learning for the Multimodal Review Helpfulness Prediction (MRHP) problem, concentrating on mutual information between input modalities to explicitly model cross-modal relations. In addition, we introduce an Adaptive Weighting scheme for our contrastive learning approach to increase flexibility in optimization. Lastly, we propose a Multimodal Interaction module to address the unaligned nature of multimodal data, thereby helping the model produce more reasonable multimodal representations. Experimental results show that our method outperforms prior baselines and achieves state-of-the-art results on two publicly available benchmark datasets for the MRHP problem.
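As a rough sketch only, one plausible reading of "adaptive weighting" in a cross-modal contrastive objective is to re-weight each pair's loss term by its current difficulty. The code below is not the paper's implementation; the weighting rule, encoders, and dimensions are all assumptions for illustration.

```python
# Hypothetical sketch: cross-modal InfoNCE loss with per-sample adaptive
# weights (assumption: harder pairs, i.e. higher current loss, get more weight).
import torch
import torch.nn.functional as F

def adaptive_crossmodal_contrastive_loss(text_emb, image_emb, temperature=0.1):
    """text_emb, image_emb: (batch, d) paired text/image representations."""
    t = F.normalize(text_emb, dim=1)
    v = F.normalize(image_emb, dim=1)
    logits = t @ v.t() / temperature                  # (B, B) similarity matrix
    targets = torch.arange(len(t), device=t.device)   # matched pairs on diagonal

    # Symmetric text-to-image and image-to-text contrastive terms, per sample.
    per_sample = F.cross_entropy(logits, targets, reduction="none") \
               + F.cross_entropy(logits.t(), targets, reduction="none")

    # Adaptive weights: normalize current losses so harder pairs count more;
    # detached so the weights themselves receive no gradient.
    weights = (per_sample / per_sample.sum().clamp(min=1e-8)).detach()
    return (weights * per_sample).sum()
```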

2021

Discovering Topics in Long-tailed Corpora with Causal Intervention
Xiaobao Wu | Chunping Li | Yishu Miao
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021

2020

Short Text Topic Modeling with Topic Distribution Quantization and Negative Sampling Decoder
Xiaobao Wu | Chunping Li | Yan Zhu | Yishu Miao
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Topic models have prevailed for many years at discovering latent semantics in long documents. For short texts, however, they generally suffer from data sparsity because of extremely limited word co-occurrences, and thus tend to yield repetitive or trivial topics of low quality. To address this issue, in this paper we propose a novel neural topic model in the autoencoding framework with a new topic distribution quantization approach that generates peakier distributions, which are more appropriate for modeling short texts. Beyond the encoding side, to tackle the issue in terms of decoding, we further propose a novel negative sampling decoder that learns from negative samples to avoid yielding repetitive topics. We observe that our model substantially improves short text topic modeling performance. Through extensive experiments on real-world datasets, we demonstrate that our model outperforms both strong traditional and neural baselines under extreme data sparsity, producing high-quality topics.
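To make the two ideas concrete, here is an illustrative sketch, not the paper's exact model: (1) a simple way to "quantize" a doc-topic distribution by sharpening it toward a near-one-hot vector, and (2) a decoder term that uses a negative document's bag-of-words to discourage reconstructions that fit unrelated texts. Function names, the sharpening rule, and the hinge form are assumptions for illustration.

```python
# Illustrative sketch (assumptions noted above): peaked topic distributions
# plus a negative-sampling reconstruction penalty for a neural topic model.
import torch
import torch.nn.functional as F

def quantize_theta(theta: torch.Tensor, temperature: float = 0.1):
    """Sharpen a (batch, K) topic distribution into a peakier one
    (equivalent to renormalizing theta ** (1 / temperature))."""
    return F.softmax(torch.log(theta.clamp(min=1e-12)) / temperature, dim=1)

def decoder_loss_with_negatives(theta, beta, bow, neg_bow, margin=1.0):
    """theta: (B, K) sharpened topic mixture; beta: (K, V) topic-word matrix;
    bow / neg_bow: (B, V) bag-of-words of the document and a negative sample."""
    recon = F.log_softmax(theta @ beta, dim=1)     # (B, V) word log-probabilities
    pos = -(bow * recon).sum(dim=1)                # reconstruct the document itself
    neg = -(neg_bow * recon).sum(dim=1)            # fit to the negative sample
    # Hinge: keep the negative's reconstruction at least `margin` worse,
    # so different documents cannot share one repetitive reconstruction.
    return (pos + F.relu(margin - (neg - pos))).mean()
```

The low-temperature sharpening stands in for the paper's quantization step (peakier distributions for short texts), while the hinge term stands in for learning from negative samples on the decoding side.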