Zero-Shot Ranking Socio-Political Texts with Transformer Language Models to Reduce Close Reading Time

Kiymet Akdemir, Ali Hürriyetoğlu


Abstract
We approach the classification problem as an entailment problem and apply zero-shot ranking to socio-political texts. Documents that are ranked at the top can be considered positively classified, which reduces the close reading time needed for the information extraction process. We use transformer language models to obtain entailment probabilities and investigate different types of queries. We find that DeBERTa achieves higher mean average precision scores than RoBERTa, and that using the declarative form of the class label as a query outperforms using the dictionary definition of the class label. We show that close reading time can be reduced by reading only a percentage of the ranked documents, where that percentage depends on the recall one wants to achieve. However, our findings also show that the percentage of documents that must be read increases as the topic gets broader.
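The ranking idea maps directly onto off-the-shelf NLI models: each document is treated as the premise and the query (e.g., the declarative form of the class label) as the hypothesis, and documents are sorted by entailment probability. Below is a minimal sketch using the Hugging Face zero-shot-classification pipeline; the model choice, example documents, label, and hypothesis template are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of entailment-based zero-shot ranking with Hugging Face
# transformers. Model name, documents, label, and template are assumptions
# for illustration; the paper compares DeBERTa and RoBERTa NLI variants.
from transformers import pipeline

# An NLI model fine-tuned on MNLI, exposed via the zero-shot pipeline.
classifier = pipeline("zero-shot-classification", model="roberta-large-mnli")

documents = [
    "Thousands of workers marched downtown demanding higher wages.",
    "The city council approved the new budget without debate.",
]

# The declarative form of the class label serves as the entailment hypothesis.
template = "This text is about {}."
label = "protest"

# Score each document by the probability that it entails the hypothesis.
scored = []
for doc in documents:
    result = classifier(doc, candidate_labels=[label],
                        hypothesis_template=template)
    scored.append((result["scores"][0], doc))

# Rank so likely positives surface first; a reader then close-reads only
# the top fraction of the list needed to reach the desired recall.
for score, doc in sorted(scored, reverse=True):
    print(f"{score:.3f}  {doc}")
```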
Anthology ID:
2022.case-1.17
Volume:
Proceedings of the 5th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Venue:
CASE
Publisher:
Association for Computational Linguistics
Pages:
124–132
URL:
https://aclanthology.org/2022.case-1.17
Cite (ACL):
Kiymet Akdemir and Ali Hürriyetoğlu. 2022. Zero-Shot Ranking Socio-Political Texts with Transformer Language Models to Reduce Close Reading Time. In Proceedings of the 5th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE), pages 124–132, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Zero-Shot Ranking Socio-Political Texts with Transformer Language Models to Reduce Close Reading Time (Akdemir & Hürriyetoğlu, CASE 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.case-1.17.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2022.case-1.17.mp4