Transformation of Dense and Sparse Text Representations

Wenpeng Hu, Mengyu Wang, Bing Liu, Feng Ji, Jinwen Ma, Dongyan Zhao


Abstract
Sparsity is regarded as a desirable property of representations, especially for explanation. However, its use has been limited by the gap between sparse and dense representations: most recent progress in NLP is built on dense representations, so the desirable property of sparsity cannot be leveraged. Inspired by the Fourier transform, in this paper we propose a novel Semantic Transformation method to bridge the dense and sparse spaces, which can help NLP research shift from dense spaces to sparse spaces or jointly use both. Experiments on classification tasks and a natural language inference task show that the proposed Semantic Transformation is effective.
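The paper's Semantic Transformation is learned end-to-end and is not detailed in this abstract. Purely as an illustration of the dense-to-sparse round trip it refers to, the sketch below maps a dense vector into an over-complete space, keeps only the top-k coefficients (the sparse code), and maps back with a pseudo-inverse. The random matrix `W`, the dimensions, and the top-k rule are all assumptions standing in for whatever the paper actually learns.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper's learned transformation is not public here.
DENSE_DIM, SPARSE_DIM, TOP_K = 8, 32, 4

# Random projection standing in for a learned dense -> sparse "basis".
W = rng.standard_normal((SPARSE_DIM, DENSE_DIM))

def to_sparse(dense, k=TOP_K):
    """Project a dense vector and keep only the k largest-magnitude coefficients."""
    coeffs = W @ dense
    drop = np.argsort(np.abs(coeffs))[:-k]  # indices of the SPARSE_DIM - k smallest
    sparse = coeffs.copy()
    sparse[drop] = 0.0
    return sparse

def to_dense(sparse):
    """Map a sparse code back to the dense space via the pseudo-inverse of W."""
    return np.linalg.pinv(W) @ sparse

dense = rng.standard_normal(DENSE_DIM)
sparse = to_sparse(dense)
recon = to_dense(sparse)

print(np.count_nonzero(sparse))  # exactly TOP_K active dimensions
```

The interpretability benefit the abstract alludes to comes from the sparse code: only a few dimensions are active per input, so each can be inspected individually, while the inverse map keeps the representation usable by dense downstream models.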
Anthology ID:
2020.coling-main.290
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3257–3267
URL:
https://aclanthology.org/2020.coling-main.290
DOI:
10.18653/v1/2020.coling-main.290
Cite (ACL):
Wenpeng Hu, Mengyu Wang, Bing Liu, Feng Ji, Jinwen Ma, and Dongyan Zhao. 2020. Transformation of Dense and Sparse Text Representations. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3257–3267, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Transformation of Dense and Sparse Text Representations (Hu et al., COLING 2020)
PDF:
https://preview.aclanthology.org/nodalida-main-page/2020.coling-main.290.pdf
Code:
morning-dews/ST