Paper Title

Sociotechnical Harms of Algorithmic Systems: Scoping a Taxonomy for Harm Reduction

Paper Authors

Renee Shelby, Shalaleh Rismani, Kathryn Henne, AJung Moon, Negar Rostamzadeh, Paul Nicholas, N'Mah Yilla, Jess Gallegos, Andrew Smart, Emilio Garcia, Gurleen Virk

Abstract

Understanding the landscape of potential harms from algorithmic systems enables practitioners to better anticipate consequences of the systems they build. It also supports the prospect of incorporating controls to help minimize harms that emerge from the interplay of technologies and social and cultural dynamics. A growing body of scholarship has identified a wide range of harms across different algorithmic technologies. However, computing research and practitioners lack a high-level and synthesized overview of harms from algorithmic systems. Based on a scoping review of computing research (n = 172), we present an applied taxonomy of sociotechnical harms to support a more systematic surfacing of potential harms in algorithmic systems. The final taxonomy builds on and refers to existing taxonomies, classifications, and terminologies. Five major themes related to sociotechnical harms - representational, allocative, quality-of-service, interpersonal harms, and social system/societal harms - and sub-themes are presented along with a description of these categories. We conclude with a discussion of challenges and opportunities for future research.
