A Comparative Analysis of the Effectiveness of Rare Tokens on Creative Expression using ramBERT

Youbin Lee, Deokgi Kim, Byung-Won On, Ingyu Lee


Abstract
Until now, few studies have explored Automated Creative Essay Scoring (ACES), in which a pre-trained model automatically labels an essay as creative or non-creative. Since the creativity evaluation of essays is highly subjective, each evaluator often has his or her own criteria for creativity. For this reason, quantifying creativity in essays is very challenging. In this work, as a preliminary study toward developing a novel model for ACES, we investigate in depth the correlation between creative essays and expressiveness. Specifically, we explore how rare tokens affect the evaluation of creativity in essays. To this end, we present five distinct methods for extracting rare tokens and conduct a comparative study on the correlation between rare tokens and creative essay evaluation results using BERT. Our experimental results show a clear correlation between rare tokens and creative essays. On all test sets, the accuracy of our rare token masking-based BERT (ramBERT) model improved over the existing BERT model by up to 14%.
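
The abstract describes masking that targets rare tokens during BERT-style pre-training. The sketch below is only an illustration of that general idea, not the authors' ramBERT implementation: it counts token frequencies over a toy corpus and preferentially masks positions whose tokens fall in an assumed low-frequency quantile (the names rare_token_mask and rare_quantile are hypothetical).

# Illustrative sketch (not the authors' method): bias MLM-style masking toward rare tokens.
import random
from collections import Counter

def build_token_counts(tokenized_corpus):
    """Count token frequencies across a tokenized corpus."""
    counts = Counter()
    for tokens in tokenized_corpus:
        counts.update(tokens)
    return counts

def rare_token_mask(tokens, counts, rare_quantile=0.1, mask_prob=0.15, mask_token="[MASK]"):
    """Mask roughly mask_prob of the tokens, preferring tokens whose corpus
    frequency falls in the lowest rare_quantile of observed frequencies
    (an assumed rarity criterion for illustration only)."""
    freqs = sorted(counts.values())
    threshold = freqs[max(0, int(len(freqs) * rare_quantile) - 1)]
    rare_positions = [i for i, t in enumerate(tokens) if counts[t] <= threshold]
    other_positions = [i for i in range(len(tokens)) if i not in set(rare_positions)]
    n_to_mask = max(1, int(len(tokens) * mask_prob))
    # Fill the masking budget with rare positions first, then random others.
    chosen = rare_positions[:n_to_mask]
    if len(chosen) < n_to_mask:
        chosen += random.sample(other_positions, min(n_to_mask - len(chosen), len(other_positions)))
    masked = list(tokens)
    for i in chosen:
        masked[i] = mask_token
    return masked

# Toy usage: the rare word "iridescent" is masked before common words.
corpus = [["the", "cat", "sat"], ["the", "iridescent", "cat", "purred"]]
counts = build_token_counts(corpus)
print(rare_token_mask(corpus[1], counts))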
Anthology ID:
2023.findings-acl.639
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10063–10077
URL:
https://aclanthology.org/2023.findings-acl.639
DOI:
10.18653/v1/2023.findings-acl.639
Cite (ACL):
Youbin Lee, Deokgi Kim, Byung-Won On, and Ingyu Lee. 2023. A Comparative Analysis of the Effectiveness of Rare Tokens on Creative Expression using ramBERT. In Findings of the Association for Computational Linguistics: ACL 2023, pages 10063–10077, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
A Comparative Analysis of the Effectiveness of Rare Tokens on Creative Expression using ramBERT (Lee et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.639.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.639.mp4