How do Words Contribute to Sentence Semantics? Revisiting Sentence Embeddings with a Perturbation Method

Wenlin Yao, Lifeng Jin, Hongming Zhang, Xiaoman Pan, Kaiqiang Song, Dian Yu, Dong Yu, Jianshu Chen


Abstract
Understanding sentence semantics requires interpreting the main information in a concrete context. To investigate how individual words contribute to sentence semantics, we propose a perturbation method for unsupervised semantic analysis. We then re-examine the ability of SOTA sentence embedding models to capture the main semantics of a sentence by developing a new evaluation metric that adapts sentence compression datasets for automatic evaluation. Results on three datasets show that unsupervised discourse relation recognition can serve as a general inference task that aggregates information into essential content more effectively than several SOTA unsupervised sentence embedding models.
Anthology ID:
2023.eacl-main.218
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3001–3010
URL:
https://aclanthology.org/2023.eacl-main.218
Cite (ACL):
Wenlin Yao, Lifeng Jin, Hongming Zhang, Xiaoman Pan, Kaiqiang Song, Dian Yu, Dong Yu, and Jianshu Chen. 2023. How do Words Contribute to Sentence Semantics? Revisiting Sentence Embeddings with a Perturbation Method. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3001–3010, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
How do Words Contribute to Sentence Semantics? Revisiting Sentence Embeddings with a Perturbation Method (Yao et al., EACL 2023)
PDF:
https://preview.aclanthology.org/remove-xml-comments/2023.eacl-main.218.pdf
Video:
https://preview.aclanthology.org/remove-xml-comments/2023.eacl-main.218.mp4