Bin Liu
2020
Learning distributed sentence vectors with bi-directional 3D convolutions
Bin Liu | Liang Wang | Guosheng Yin
Proceedings of the 28th International Conference on Computational Linguistics
We propose to learn distributed sentence representations using text’s visual features as input. Unlike existing methods that render the words or characters of a sentence into separate images, we further fold these images into a 3-dimensional sentence tensor. Then, multiple 3-dimensional convolutions with different lengths (along the third dimension) are applied to the sentence tensor, acting jointly as bi-gram, tri-gram, quad-gram, and even five-gram detectors. Similar to a Bi-LSTM, these n-gram detectors learn both forward and backward distributional semantic knowledge from the sentence tensor. That is, the proposed model uses bi-directional convolutions to learn text embeddings according to the semantic order of words. The feature maps from the two directions are concatenated to learn the final sentence embedding. Our model involves only a single layer of convolution, which makes it easy and fast to train. Finally, we evaluate the sentence embeddings on several downstream Natural Language Processing (NLP) tasks, on which the proposed model demonstrates surprisingly strong performance.
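The abstract describes the encoder's structure: word images stacked into a 3D sentence tensor, 3D convolutions of several depths acting as n-gram detectors, a forward and a backward pass over the word axis, and concatenation of the resulting feature maps. Below is a minimal PyTorch sketch of that idea; the channel count, 32x32 word-image resolution, ReLU activation, and max-pooling are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a bi-directional 3D-convolution sentence encoder as outlined
# in the abstract. Layer sizes, activation, and pooling are assumed for illustration.
import torch
import torch.nn as nn


class BiDir3DConvEncoder(nn.Module):
    def __init__(self, ngram_sizes=(2, 3, 4, 5), channels=64, img_size=32):
        super().__init__()
        # One 3D convolution per n-gram size; the kernel depth (third dimension)
        # spans n consecutive word images in the sentence tensor.
        self.convs = nn.ModuleList(
            nn.Conv3d(1, channels, kernel_size=(n, img_size, img_size))
            for n in ngram_sizes
        )

    def forward(self, sentence_tensor):
        # sentence_tensor: (batch, 1, num_words, img_size, img_size),
        # where each word of the sentence has been rendered as an image.
        reversed_tensor = torch.flip(sentence_tensor, dims=[2])  # backward direction over words
        features = []
        for conv in self.convs:
            for x in (sentence_tensor, reversed_tensor):
                fmap = torch.relu(conv(x))            # (batch, channels, num_words - n + 1, 1, 1)
                fmap = fmap.squeeze(-1).squeeze(-1)   # drop the singleton spatial dims
                pooled, _ = fmap.max(dim=2)           # max-pool over n-gram positions
                features.append(pooled)
        # Concatenate forward and backward feature maps into the sentence embedding.
        return torch.cat(features, dim=1)             # (batch, 2 * len(ngram_sizes) * channels)


# Example usage: a batch of 4 sentences, 20 words each, rendered as 32x32 images.
if __name__ == "__main__":
    encoder = BiDir3DConvEncoder()
    sents = torch.randn(4, 1, 20, 32, 32)
    print(encoder(sents).shape)  # torch.Size([4, 512])
```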
2014
Cross-lingual Opinion Analysis via Negative Transfer Detection
Lin Gui | Ruifeng Xu | Qin Lu | Jun Xu | Jian Xu | Bin Liu | Xiaolong Wang
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)