Using Transformer-based Models for Taxonomy Enrichment and Sentence Classification

Parag Pravin Dakle, Shrikumar Patil, Sai Krishna Rallabandi, Chaitra Hegde, Preethi Raghavan


Abstract
In this paper, we present a system that addresses the taxonomy enrichment problem for Environment, Social and Governance (ESG) issues in the financial domain, as well as the classification of sentences as sustainable or unsustainable, for FinSim4-ESG, a shared task of the FinNLP workshop at IJCAI-2022. We first created a derived dataset for taxonomy enrichment by applying a Sentence-BERT-based paraphrase detector (Reimers and Gurevych, 2019) to the train set to create positive and negative term-concept pairs. We then model the problem by fine-tuning the Sentence-BERT-based paraphrase detector on this derived dataset and using it as the encoder, with a Logistic Regression classifier as the decoder, achieving a test accuracy of 0.6 and an average rank of 1.97. For the sentence classification task, the best-performing classifier (accuracy: 0.92) consists of a pre-trained RoBERTa model (Liu et al., 2019a) as the encoder and a feed-forward neural network classifier as the decoder.
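
As an illustration of the taxonomy-enrichment setup described in the abstract, the sketch below encodes term-concept pairs with a pre-trained Sentence-BERT paraphrase model and trains a Logistic Regression classifier on the pair embeddings, mirroring the encoder/decoder split the authors describe. The checkpoint name, the example ESG terms and concepts, and the use of concatenated embeddings as the pair representation are assumptions for illustration, not the authors' released implementation.

# Minimal sketch (not the authors' code): Sentence-BERT encoder + Logistic Regression decoder
# for scoring term-concept pairs. Checkpoint and example data are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Pre-trained paraphrase Sentence-BERT model (Reimers and Gurevych, 2019);
# the exact checkpoint used in the paper is not specified here.
encoder = SentenceTransformer("paraphrase-MiniLM-L6-v2")

# Hypothetical derived dataset of (term, concept, label) triples,
# where label = 1 marks a correct term-concept pair and 0 a negative pair.
pairs = [
    ("carbon footprint reduction", "Emissions", 1),
    ("board diversity policy", "Emissions", 0),
    ("employee health and safety", "Social", 1),
    ("hazardous waste disposal", "Social", 0),
]

term_vecs = encoder.encode([t for t, _, _ in pairs])
concept_vecs = encoder.encode([c for _, c, _ in pairs])
labels = [y for _, _, y in pairs]

# One simple pair representation: concatenate the term and concept embeddings.
features = np.concatenate([term_vecs, concept_vecs], axis=1)

clf = LogisticRegression(max_iter=1000)
clf.fit(features, labels)

# At inference, a new term can be ranked against candidate concepts by the
# classifier's positive-class probability (matching the average-rank evaluation).
term_vec = encoder.encode(["renewable energy usage"])[0]
candidates = ["Emissions", "Waste management", "Human rights"]
cand_vecs = encoder.encode(candidates)
cand_features = np.concatenate([np.tile(term_vec, (len(candidates), 1)), cand_vecs], axis=1)
scores = clf.predict_proba(cand_features)[:, 1]
ranked = [c for _, c in sorted(zip(scores, candidates), reverse=True)]

The same encoder/decoder pattern carries over to the sentence classification task in the abstract, with a RoBERTa encoder and a feed-forward classifier head in place of the Sentence-BERT model and Logistic Regression.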
Anthology ID:
2022.finnlp-1.34
Volume:
Proceedings of the Fourth Workshop on Financial Technology and Natural Language Processing (FinNLP)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Chung-Chi Chen, Hen-Hsen Huang, Hiroya Takamura, Hsin-Hsi Chen
Venue:
FinNLP
Publisher:
Association for Computational Linguistics
Pages:
250–258
URL:
https://aclanthology.org/2022.finnlp-1.34
DOI:
10.18653/v1/2022.finnlp-1.34
Cite (ACL):
Parag Pravin Dakle, Shrikumar Patil, Sai Krishna Rallabandi, Chaitra Hegde, and Preethi Raghavan. 2022. Using Transformer-based Models for Taxonomy Enrichment and Sentence Classification. In Proceedings of the Fourth Workshop on Financial Technology and Natural Language Processing (FinNLP), pages 250–258, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Using Transformer-based Models for Taxonomy Enrichment and Sentence Classification (Dakle et al., FinNLP 2022)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2022.finnlp-1.34.pdf