Panel Abstract
As our world becomes increasingly digitized, algorithmic systems are reconfiguring the ways we interact with technologies. Rather than simply delegating decisions from humans to machines, these systems mediate and transform human agencies and concerns. In this session, we focus on the affective landscapes of digital developments and explore how values are enacted in algorithmic systems. We will examine the trade-offs, frictions, and hopes that arise with the implementation and use, as well as the rejection, of such systems. By staying close to how these systems come into being and affect subjectivities, everyday lives, and organizational structures, we aim to demonstrate that our conceptual and theoretical choices shape how we think about past, present, possible, and impossible futures in relation to technologies. We will attend to the different temporalities and spatialities at play in digital developments, explore the stability and instability of algorithmically motivated practices and infrastructures, bridge various facets of digital development, and open up perspectives on concerns that are currently neglected. Ultimately, our goal is to identify hopeful constellations in the current digitized world through a better understanding of the multifaceted nature of algorithmic systems.
Minna Ruckenstein: The Digital Geography of Fear
This paper zooms in on emotional articulations of how datafication ‘feels’ to query what they might tell us about a cultural shift in contemporary society that promotes affectively charged technology relations. One route for relocating experiences of fear and ‘mild paranoia’ from the personal to the collective sphere is by way of analogy. The paper builds on Gil Valentine’s (1989) notion of the geography of fear, which draws attention to gendered experiences in public spaces, and introduces a parallel concept, the digital geography of fear, to bring digital distress under the joint banner of a structure of feeling. This opens a perspective on how people cope with the fact that they have limited knowledge of, and control over, the dissemination and use of personal data. The lens of the digital geography of fear emphasizes the patterned nature of personal experiences and their links to underlying structures of power. Attending to this affective infrastructure thus invites us to see fear and distress in a new light, as a form of collective harm. Personal experiences can be thought of as system failures, in the sense that they repeatedly reveal informational asymmetries and related practices of power. Emotional responses to algorithmic experiences call for new forms of collective action to better deal with present-day digital vulnerabilities.
Santeri Räisänen: The Aesthetics and Affect of Governmental High-Tech
The construction of technological futures is as much a site for cultural production as it is one for the production of technological artefacts. Researchers in both STS and cultural studies have turned towards the cultural elements of technology, its imaginaries, narratives, myths, and representations, as pivotal in the construction of a culture of technological progress. In this line of research, I approach the visual representations of high-tech in a governmental artificial intelligence program that failed to produce artificial intelligence, and that, as its planners acknowledged in retrospect, was never intended to do so. By analyzing the performances of technological spectacle produced by the program, I identify a shift in the visual representation of technology: from the functional, promissory, and demonstrative, to the abstracted and ambiguous, and finally to the non-functional and purely aesthetic, ending in representations of high-tech in which use-value is entirely supplanted by sign-value. Drawing on Paolo Virno’s sentiments of disenchantment, I argue that this shifting representation of technology is diagnostic of an ambivalent affect bubbling under the surface of the technocratic bureaucracy: one marked at once by the abstract opportunism of high-tech and by a cynical recognition of its empty façade. Ultimately, the possibility for hope in technopolitics is tempered by an uneasy tension with the empty signifiers around which it is built.
Laura Savolainen: Hopeful Algorithms: Imagining Recommendation Systems Beyond Algorithmic Markets and Autocracies
In this talk, which draws and reflects on my completed doctoral research on social media algorithms, I first frame algorithms, understood as sets of rules, as a sociological concern. Scholars have understood algorithms both as one example of, and as a metaphor for, the invisible forces or hidden ‘laws’ that order and guide observable social behavior and phenomena. Accordingly, previous work on algorithmic power has searched for analogies in social forms seemingly rife with formal laws and rules: markets, bureaucracies, and games. By illustrating the value as well as the limits of these notions, I seek in this presentation to move the discussion forward. I suggest we should conceive of algorithmic orders not as linear and governed by fixed, rational rules, but as messy, unpredictable, and always emerging. Finally, I stress the importance of choosing our metaphors carefully, as they may limit our imagination of what could be. For instance, instead of the more obvious notions of markets or autocracies, what if we thought of algorithmic platforms as welfare states? Or what if we took a break from governance metaphors altogether, and thought about the smaller corrections and repair work that could help recalibrate how algorithms come to be felt and lived with?