Contrastive Training Improves Zero-Shot Classification of Semi-structured Documents
Muhammad Khalifa, Yogarshi Vyas, Shuai Wang, Graham Horwood, Sunil Mallya, Miguel Ballesteros
Abstract
We investigate semi-structured document classification in a zero-shot setting. Classification of semi-structured documents is more challenging than that of standard unstructured documents, as positional, layout, and style information plays a vital role in interpreting such documents. The standard classification setting, where categories are fixed during both training and testing, falls short in dynamic environments where new classification categories may emerge. We focus exclusively on the zero-shot learning setting, where inference is performed on new, unseen classes. To address this task, we propose a matching-based approach that relies on a pairwise contrastive objective for both pretraining and fine-tuning. Our results show a significant boost in Macro F1 from the proposed pretraining step, and comparable performance of the contrastive fine-tuning to a standard prediction objective in both supervised and unsupervised zero-shot settings.
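To make the matching-based formulation concrete, below is a minimal sketch of a document–class matching model trained with a pairwise contrastive objective and used for zero-shot prediction over unseen classes. The encoder checkpoint, margin, loss form, and class descriptions are illustrative assumptions, not the authors' exact setup, and the layout and style features of semi-structured documents are omitted for brevity.

```python
# A minimal sketch of matching-based zero-shot classification with a pairwise
# contrastive objective. All names (model checkpoint, margin, label
# descriptions) are assumptions, not the paper's exact configuration, and the
# layout-aware features used for semi-structured documents are left out.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # assumption; a layout-aware encoder could be swapped in
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)


def embed(texts):
    """Mean-pooled embeddings from a shared encoder for documents and class descriptions."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state            # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)


def pairwise_contrastive_loss(doc_emb, cls_emb, match, margin=0.5):
    """Pull matching (document, class description) pairs together, push mismatched pairs apart.

    match[i] == 1.0 if document i and class description i form a positive pair, else 0.0.
    """
    sim = F.cosine_similarity(doc_emb, cls_emb)             # (B,)
    pos = match * (1.0 - sim)                               # positive pairs: drive similarity up
    neg = (1.0 - match) * F.relu(sim - margin)              # negative pairs: push similarity below margin
    return (pos + neg).mean()


@torch.no_grad()
def zero_shot_predict(document, class_descriptions):
    """Score a document against (possibly unseen) class descriptions and return the best match."""
    doc_emb = embed([document])                              # (1, H)
    cls_emb = embed(class_descriptions)                      # (C, H)
    scores = F.cosine_similarity(doc_emb, cls_emb)           # (C,)
    return int(scores.argmax())


if __name__ == "__main__":
    # Toy training step on two positive (document, class description) pairs.
    docs = ["Invoice No. 1234  Total due: $500", "Resume: 5 years of Python experience"]
    descs = ["an invoice or billing document", "a job applicant resume"]
    match = torch.tensor([1.0, 1.0])
    loss = pairwise_contrastive_loss(embed(docs), embed(descs), match)
    loss.backward()                                           # would be followed by an optimizer step
    print("loss:", loss.item())
    print("predicted class:", zero_shot_predict(docs[0], descs))
```

Because classes are represented by their textual descriptions rather than fixed output indices, new categories can be scored at inference time without retraining, which is what enables the zero-shot setting described above.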
- Anthology ID:
- 2023.findings-acl.473
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2023
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Editors:
- Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 7499–7508
- URL:
- https://aclanthology.org/2023.findings-acl.473
- DOI:
- 10.18653/v1/2023.findings-acl.473
- Cite (ACL):
- Muhammad Khalifa, Yogarshi Vyas, Shuai Wang, Graham Horwood, Sunil Mallya, and Miguel Ballesteros. 2023. Contrastive Training Improves Zero-Shot Classification of Semi-structured Documents. In Findings of the Association for Computational Linguistics: ACL 2023, pages 7499–7508, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Contrastive Training Improves Zero-Shot Classification of Semi-structured Documents (Khalifa et al., Findings 2023)
- PDF:
- https://aclanthology.org/2023.findings-acl.473.pdf