To be Continuous, or to be Discrete, Those are Bits of Questions

Yiran Wang, Masao Utiyama


Abstract
Recently, binary representation has been proposed as a novel representation that lies between continuous and discrete representations. It exhibits considerable information-preserving capability when used to replace continuous input vectors. In this paper, we investigate the feasibility of further introducing it to the output side, aiming to allow models to output binary labels instead. To preserve the structural information on the output side along with label information, we extend the previous contrastive hashing method to structured contrastive hashing. More specifically, we upgrade CKY from the label level to the bit level, define a new similarity function based on span marginal probabilities, and introduce a novel contrastive loss function with a carefully designed instance selection strategy. Our model achieves competitive performance on various structured prediction tasks, demonstrating that binary representation can be regarded as a novel representation that further bridges the gap between the continuous nature of deep learning and the discrete intrinsic property of natural languages.
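The core recipe sketched in the abstract, learning binary output codes whose pairwise similarity is pulled toward a structure-derived target similarity, can be illustrated with a short sketch. The code below is an assumption-laden illustration, not the authors' implementation: the names (bit_logits, target_sim, contrastive_hashing_loss), the tanh relaxation, and the soft-target cross-entropy form are stand-ins for the paper's actual bit-level CKY, span-marginal similarity function, and instance selection strategy.

import torch
import torch.nn.functional as F

def contrastive_hashing_loss(bit_logits, target_sim, tau=0.1):
    """Contrastive loss pulling binary codes of similar spans together.

    bit_logits: (N, B) real-valued logits for the B-bit codes of N spans.
    target_sim: (N, N) target similarities in [0, 1], e.g. derived from
        span marginal probabilities (a hypothetical stand-in here).
    """
    # Relax hard bits to (-1, 1) with tanh so the loss is differentiable;
    # at inference the codes would be quantised with sign().
    codes = F.normalize(torch.tanh(bit_logits), dim=-1)
    # Temperature-scaled pairwise cosine similarity between relaxed codes.
    logits = codes @ codes.t() / tau
    # Exclude self-pairs from both the prediction and the target.
    eye = torch.eye(len(codes), dtype=torch.bool, device=codes.device)
    logits = logits.masked_fill(eye, float('-inf'))
    targets = target_sim.masked_fill(eye, 0.0)
    targets = targets / targets.sum(dim=-1, keepdim=True).clamp_min(1e-9)
    # Cross-entropy between the soft targets and the code-similarity
    # distribution: rows with high structural similarity act as positives.
    logp = F.log_softmax(logits, dim=-1).masked_fill(eye, 0.0)
    return -(targets * logp).sum(dim=-1).mean()

A toy call such as contrastive_hashing_loss(torch.randn(8, 16, requires_grad=True), torch.rand(8, 8)) runs end to end. The tanh relaxation is one common way to keep hashing objectives differentiable during training, with hard sign() quantisation applied only at decoding time.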
Anthology ID:
2024.acl-long.436
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8036–8049
URL:
https://aclanthology.org/2024.acl-long.436
Cite (ACL):
Yiran Wang and Masao Utiyama. 2024. To be Continuous, or to be Discrete, Those are Bits of Questions. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8036–8049, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
To be Continuous, or to be Discrete, Those are Bits of Questions (Wang & Utiyama, ACL 2024)
PDF:
https://aclanthology.org/2024.acl-long.436.pdf