Reading Algorithms

Queer and Transgressive Encounters. A Workshop

— Concept: Orit Halpern (Associate Professor of Sociology and Anthropology, Concordia University, Montreal), with Lisa Conrad, Irina Kaldrack (Wissenskulturen im digitalen Zeitalter, Hochschule für Bildende Künste, Braunschweig), and Martina Leeker, June 2016, DCRL, Leuphana Universität Lüneburg
— Introduction: Martina Leeker

Felix Stalder, culture and media theorist at Zurich University of the Arts (ZHdK), said in his introduction to the “Algorithmic Regimes and Generative Strategies” conference in Vienna, 2015, that we need to develop new perspectives, approaches, attitudes, and methods in the context of algorithms (cf. an archive dedicated to algorithms: FUTURE NON STOP, A Living Archive for Digital Culture in Theory and Practice). Both technological determinism, the idea that technology shapes society, and technological constructivism, based on the assumption that technology is culturally produced by various institutions and agents, have been overtaken. Algorithms are, in a nutshell, far more performative: they create the world that we live in and talk about, functioning in a way that is no longer under human control. Instead, they develop their own agency, one that is no longer completely transparent, as concepts and technologies of “machine learning” show (Alexander Löser, Christian Herta, Deep Learning. Maschinen, die wie Menschen lernen, 2015). The new blending of algorithmically driven models and material reality has led to an epistemology of non-human performativity governed by unpredictability and precarious forms of existence.

Because this regime of algorithms can no longer be ignored, new methods must be developed in order, firstly, to understand algorithms and make them intelligible and, secondly, to influence and, preferably, modify them. The basis for this is to determine, from research into the way algorithms function, the points at which they can be accessed, and also where they close themselves off or are sealed off for political or economic reasons.

This endeavor should be guided by two maxims. The first is the requirement for a broad analysis of algorithmic controls. Media theorist Wendy Chun, for example, points out that, among other things, “homophily” (Wendy Chun, Governing through Homophily: Network Science as Segregation, lecture at the conference “Forces of Reticulation”, CAA, Hangzhou, China 2016), a concept transferred from sociology into technological network theory, directs technological constitution through algorithm-driven networks and databases. Homophily means “love of the same,” which excludes diversity already at a conceptual level and preprograms segregation. It follows that any intervention through which algorithms are to be viewed and altered must be founded not just on a technology-critical level but equally on a discursive level and, according to Chun, on an analysis of “habits” (Wendy Chun, Paradoxes and Habits, 2014; cf. Wendy Chun, Updating to Remain the Same: Habitual New Media, 2016). For it can be assumed, given the concepts from which algorithms are generated, that they are extremely political. Their politics and regimes must be tracked.

The second maxim is grounded in a concept for the development of a different algorithm design (forthcoming). This approach must be carried out, as it were, from within the algorithmic regime, because distanced, technology-critical methods could, according to Felix Stalder, be counterproductive, since they often follow conventional ideas of the relationship between humans and technology. Stalder sees this in the criticism voiced by the “left” of the media culture industry and in an inhumane history of technology, which misses the core of algorithmic operations and their effects, namely non-human performativity, and can lead to a passive rather than targeted appropriation of unfettered algorithms. The question is, therefore, how technology and algorithmic regimes can be met affirmatively and still be put to queer uses. How, for example, could automation be used if, according to Stalder, it could also potentially erode and thus destroy a basic element of the capitalist order, wage labor, thereby opening new options for thinking about and organizing economy and politics? Jennifer Lyn Morone pursues such a critical affirmation with her project JLM Inc., in which she markets herself as a data provider and a company, so that in this data-economy world she might at least profit from the sale of her own data. Wendy Chun likewise emphasizes a queer handling of algorithms (forthcoming) and digital cultures when she insists on an ethic of difference and on remaining in uncomfortable situations in order to circumvent homophily repeatedly and sustainably.

The workshop Reading Algorithms. Queer and Transgressive Encounters was developed in the context of this double approach of criticism and affirmation. It was conceived by Orit Halpern, currently professor of sociology and anthropology at Concordia University in Montreal, Canada, in June 2016 as part of her fellowship at the DCRL. The implementation team included Lisa Conrad, researcher and lecturer in media and organization studies, media theorist Irina Kaldrack, and Martina Leeker. The starting point was the idea that algorithms, which extensively permeate social, physical, and political life, should be accessed in an unconventional way by classifying and presenting them creatively. An order of algorithms modeled on natural history served as orientation, with the goal of creating “field guides” that could be used to conduct guided tours of algorithms in various environments.

The results of the workshop are documented here. They include introductions by the participants, presenting the favorite algorithms each participant selected in preparation for the workshop, to be integrated into the field guides.

In a short lecture, Orit Halpern introduces the workshop’s theme and issues. She highlights unconventional perspectives on algorithms, such as their acoustics in places like data centers, or their paths of movement when they control cranes, for example. These unusual views correspond to aspects and characteristics that can be used to classify algorithms. Orit Halpern also outlines potential ordering systems as media of world production and world domination, and thereby reflects on a political aspect of classification: such systems imply world-ordering models in which racist or colonial value systems are also embedded. The modeling and operation of algorithms likewise follow such politically motivated or economically underpinned patterns. To re-classify them is, therefore, to suspend the ordering patterns of cartographies and natural history as much as those of algorithms, and to speculate on their re-design into new orders of the world.

Following these introductions, the participants worked in groups on their field guide concepts for two days, and the results of the groups’ work were then presented.

List of participants

About us

Re-thinking methods: experiments & interventions as a method of continuously and discourse-analytically reconsidering and rethinking the conditions and perspectives of digital cultures and their study. More on this on the About us page...

The DCRL is part of the Centre for Digital Cultures (CDC) at Leuphana Universität Lüneburg.

Kontakthof 2.0

The theater project “Kontakthof 2.0. Living in digital cultures” was created for the farewell of the Innovation Incubator, Digital Media division, of Leuphana Universität Lüneburg, an EU-funded regional development project that ran from 2009 to 2015. The performance took place on June 24, 2015, in a seminar room at the university as part of the incubator’s closing event. The piece was developed with students in a seminar on the topic of “TheorieTheater”. Detailed project documentation and a recording of the performance can be found on the subpage “Kontakthof 2.0. Embodiment of Remix und TheorieTheater”.