Wordflow: Social Prompt Engineering for Large Language Models

Zijie Wang, Aishwarya Chakravarthy, David Munechika, Duen Horng Chau


Abstract
Large language models (LLMs) require well-crafted prompts for effective use. Prompt engineering, the process of designing prompts, is challenging, particularly for non-experts who are less familiar with AI technologies. While researchers have proposed techniques and tools to assist LLM users in prompt design, these works primarily target AI application developers rather than non-experts. To address this research gap, we propose social prompt engineering, a novel paradigm that leverages social computing techniques to facilitate collaborative prompt design. To investigate social prompt engineering, we introduce Wordflow, an open-source and social text editor that enables everyday users to easily create, run, share, and discover LLM prompts. Additionally, by leveraging modern web technologies, Wordflow allows users to run LLMs locally and privately in their browsers. Two usage scenarios highlight how social prompt engineering and our tool can enhance laypeople’s interaction with LLMs. Wordflow is publicly accessible at https://poloclub.github.io/wordflow.
Anthology ID:
2024.acl-demos.5
Volume:
Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Yixin Cao, Yang Feng, Deyi Xiong
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
42–50
URL:
https://aclanthology.org/2024.acl-demos.5
Cite (ACL):
Zijie Wang, Aishwarya Chakravarthy, David Munechika, and Duen Horng Chau. 2024. Wordflow: Social Prompt Engineering for Large Language Models. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations), pages 42–50, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Wordflow: Social Prompt Engineering for Large Language Models (Wang et al., ACL 2024)
PDF:
https://aclanthology.org/2024.acl-demos.5.pdf