Large Language Models as SocioTechnical Systems

Kaustubh Dhole


Abstract
The expectation that Large Language Models (LLMs) will solve various societal problems has ignored the larger socio-technical frame of reference within which they operate. From a socio-technical perspective, LLMs need to be examined separately from other ML models, as their societal implications are radically different from anything witnessed before. In this article, we ground Selbst et al. (2019)'s five abstraction traps – the Framing Trap, the Portability Trap, the Formalism Trap, the Ripple Effect Trap, and the Solutionism Trap – in the context of LLMs, discussing the problems associated with the abstraction and fairness of LLMs. Drawing on lessons and examples from previous studies, we discuss each trap that LLMs fall into and propose ways to address these points of LLM failure by gauging them through a socio-technical lens. We believe these discussions provide a broader, socio-technical perspective on LLMs, and our recommendations could serve as baselines to effectively demarcate responsibilities among the various technical and social stakeholders and to inspire future LLM research.
Anthology ID:
2023.bigpicture-1.6
Volume:
Proceedings of the Big Picture Workshop
Month:
December
Year:
2023
Address:
Singapore
Editors:
Yanai Elazar, Allyson Ettinger, Nora Kassner, Sebastian Ruder, Noah A. Smith
Venue:
BigPicture
Publisher:
Association for Computational Linguistics
Pages:
66–79
URL:
https://aclanthology.org/2023.bigpicture-1.6
DOI:
10.18653/v1/2023.bigpicture-1.6
Cite (ACL):
Kaustubh Dhole. 2023. Large Language Models as SocioTechnical Systems. In Proceedings of the Big Picture Workshop, pages 66–79, Singapore. Association for Computational Linguistics.
Cite (Informal):
Large Language Models as SocioTechnical Systems (Dhole, BigPicture 2023)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/2023.bigpicture-1.6.pdf