Concept: Orit Halpern (Associate Professor of Sociology and Anthropology, Concordia University Montreal), with Lisa Conrad, Irina Kaldrack (Wissenskulturen im digitalen Zeitalter, Hochschule für Bildende Künste, Braunschweig), and Martina Leeker, June 2016, DCRL, Leuphana Universität Lüneburg
Introduction: Martina Leeker
Felix Stalder, culture and media theorist at the Zurich University of the Arts (ZHdK), said in his introduction to the conference “Algorithmic Regimes and Generative Strategies” in Vienna, 2015, that we need to develop new perspectives, approaches, attitudes, and methods in the context of algorithms (cf. an archive dedicated to algorithms: FUTURE NON STOP, A Living Archive for Digital Culture in Theory and Practice). Both technological determinism, the idea that technology shapes society, and technological constructivism, based on the assumption that technology is culturally produced via various institutions and agents, have been superseded. Algorithms are, in a nutshell, far more performative. That is, they create the world that we live in and talk about, functioning in a way that is no longer under human control. Instead, they develop their own agency, one that is no longer completely transparent, as the concepts and technologies of “machine learning” demonstrate (Alexander Löser, Christian Herta, Deep Learning. Maschinen, die wie Menschen lernen, 2015). The new blending of algorithmically driven models and material reality has led to an epistemology of non-human performativity governed by unpredictability and precarious forms of existence.
Because this regime of algorithms can no longer be ignored, new methods must be developed in order, firstly, to understand algorithms and make them intelligible, and secondly, to influence them and, preferably, to modify them. The basis for this is to determine, from research into how algorithms function, the points at which they can be accessed, as well as the points at which they close themselves off or are sealed off for political or economic reasons.
This endeavor should be guided by two maxims. The first is the requirement for a broad analysis of algorithmic controls. Media theorist Wendy Chun, for example, points out that, among other things, “homophily” (Wendy Chun, Governing through Homophily. Network Science as Segregation, lecture at the conference “Forces of Reticulation”, CAA, Hangzhou, China, 2016), a concept transferred from sociology into technological network theory, directs technological constitution through algorithm-driven networks and databases. Homophily means “love of the same,” which excludes diversity at the conceptual level already and preprograms segregation. It follows that any intervention by which algorithms are to be viewed and altered must be founded not just on a technology-critical level but equally on a discursive level and, according to Wendy Chun, on an analysis of “habits” (Wendy Chun, Paradoxes and Habits, 2014; cf. Wendy Chun, Updating to Remain the Same. Habitual New Media, 2016). For it can be assumed, based on the concepts from which algorithms are generated, that they are deeply political; their politics and regimes must be tracked.
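To make the notion of homophily more concrete, the short Python sketch below illustrates how network science commonly operationalizes it, as the share of ties connecting nodes that share an attribute, and how a naive “more of the same” recommendation rule reproduces it. This is a minimal sketch for orientation only; the toy graph, the attribute labels, and the recommendation function are hypothetical illustrations, not drawn from Chun’s lecture or book.

```python
# Minimal sketch: operationalizing "homophily" in a toy social network.
# The graph, node attributes, and recommendation rule are hypothetical
# examples, not taken from Wendy Chun's work.

# Each node carries one categorical attribute (e.g. a group label).
attributes = {
    "ada": "A", "bea": "A", "cem": "A",
    "dan": "B", "eva": "B", "fio": "B",
}

edges = [
    ("ada", "bea"), ("bea", "cem"), ("ada", "cem"),  # ties within group A
    ("dan", "eva"), ("eva", "fio"),                  # ties within group B
    ("cem", "dan"),                                  # one cross-group tie
]

def homophily_index(edges, attributes):
    """Share of edges that connect nodes with the same attribute."""
    same = sum(1 for u, v in edges if attributes[u] == attributes[v])
    return same / len(edges)

def recommend_same(node, attributes, edges):
    """A naive 'love of the same' rule: suggest only unconnected nodes
    that share the node's attribute, thereby reproducing homophily."""
    linked = {v for u, v in edges if u == node} | {u for u, v in edges if v == node}
    return [other for other in attributes
            if other != node
            and other not in linked
            and attributes[other] == attributes[node]]

print(f"homophily index: {homophily_index(edges, attributes):.2f}")
print("suggested ties for 'dan':", recommend_same("dan", attributes, edges))
```

In this toy network five of the six ties link same-group nodes, so the index comes out at 0.83, and the recommender only ever proposes new same-group ties; this is the self-reinforcing, segregating dynamic that the concept of homophily names.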
The second maxim is grounded in a concept for the development of a different algorithm design (forthcoming). This approach must be carried out, as it were, from within the algorithmic regime, because distanced, technology-critical methods could, according to Felix Stalder, be counterproductive, since they often follow conventional ideas of the relationship between humans and technology. Stalder highlights this in the “left’s” criticism of the media and culture industry, and in an inhumane history of technology that attributes the core of algorithmic operations and their effects to non-human performativity, which can lead to a passive rather than targeted appropriation of unfettered algorithms. The question is, therefore, how technology and algorithmic regimes can be met affirmatively and still be put to queer use. How, for example, could automation be used if, according to Stalder, it could also potentially erode and thus destroy a basic element of the capitalist order, wage labor, thereby opening up new ways of thinking about and organizing economy and politics? Jennifer Lyn Morone pursues such a critical affirmation with her project JLM Inc., in which she markets herself as a data provider and incorporates herself as a company, so that in this data-economy world she might at least profit from the sale of her own data. Wendy Chun likewise puts emphasis on a queer handling of algorithms and digital cultures (forthcoming) when she insists on an ethics of difference and on remaining in uncomfortable situations in order to repeatedly and sustainably circumvent homophily.
The workshop “Reading Algorithms. Queer and Transgressive Encounters” was developed in the context of this double approach of criticism and affirmation. It was conceived by Orit Halpern, currently associate professor of sociology and anthropology at Concordia University in Montreal, Canada, in June 2016 as part of her fellowship at the DCRL. Members of the implementation team were Lisa Conrad, researcher and lecturer in media and organization studies, media theorist Irina Kaldrack, and Martina Leeker. The starting point was the idea that algorithms, which extensively permeate social, physical, and political life, should be accessed in an unconventional way by being classified and presented creatively. An order of algorithms based on natural history served as orientation, with the goal of creating “field guides” that could be used to conduct guided tours of algorithms in various environments.
The results of the workshop are documented here. They include introductions by the participants, in which each presents a favorite algorithm selected in preparation for the workshop and intended for integration into the field guides.
In a short lecture, Orit Halpern introduces the workshop’s theme and issues. She highlights unconventional perspectives on algorithms, such as their acoustics in places like data centers, or their paths of movement when they control cranes. These unusual views correspond to aspects and characteristics that can be used to classify algorithms. Orit Halpern also outlines potential ordering systems, such as cartographies and natural history, as media of world production and world domination, and thereby reflects on the political dimension of classification, since such systems imply world-ordering models in which racist or colonial value systems are embedded. The modeling and operation of algorithms also follow such politically motivated or economically underpinned patterns. To re-classify them is, therefore, to suspend the ordering patterns of cartographies and natural history as much as those of algorithms, and to speculate on their re-design into new orders of the world.
Following these introductions, the participants worked in groups on their field guide concepts for two days, and the results of the groups’ work were then presented.
List of participants
- Jamie Allen (Senior Researcher at the Critical Media Lab https://www.ixdm.ch/critical-media-lab/ in Basel, Switzerland), http://www.jamieallen.com/
- Clemens Apprich (Research Associate DCRL, Leuphana University Lüneburg) http://www.leuphana.de/en/university/staff-members/clemens-apprich.html
- Thomas Bächle (Universität Bonn, Institut für Sprach-, Medien- und Musikwissenschaft) https://www.medienwissenschaft.uni-bonn.de/abteilung/team-1/dr.-thomas-christian-baechle
- Armin Beverungen (Junior Director DCRL, Leuphana University Lüneburg) http://www.leuphana.de/en/university/staff-members/armin-beverungen.html
- Andreas Broeckmann (Leuphana Arts Program http://www.leuphana.de/en/arts-program.html, Leuphana University Lüneburg) http://www.leuphana.de/universitaet/personen/andreas-broeckmann.html
- Christoph Brunner (Prof. for Cultural Theory, Leuphana University Lüneburg) http://www.leuphana.de/universitaet/personen/christoph-brunner.html
- Marcus Burkhardt (Senior Researcher, Munich Center for Technology in Society, Technical University of Munich) https://www.mcts.tum.de/en/personen/researchers/marcus-burkhardt/
- Lisa Conrad (Research Associate DCRL, Leuphana University Lüneburg) http://www.leuphana.de/en/university/staff-members/lisa-conrad.html
- Christina Drachsler http://www.leuphana.de/dfg-programme/mecs/personen/christina-drachsler.html
- Maya Indira Ganesh (Director of Applied Research, Tactical Technology Collective https://tacticaltech.org/, Berlin) https://tacticaltech.org/team/maya-indira
- Jeremy Grosman (PhD-Student, University of Namur, Belgium) https://directory.unamur.be/staff/jgrosman
- Anne Haaning (Artist, London) http://www.annehaaning.com/
- Orit Halpern (Associate Professor Sociology and Anthropology, Concordia University Montreal, http://explore.concordia.ca/orit-halpern) http://orithalpern.net/
- Wiebke Jahn (Student Leuphana University Lüneburg)
- Irina Kaldrack (Wissenskulturen im digitalen Zeitalter, Hochschule für Bildende Künste, Braunschweig) http://www.hbk-bs.de/hochschule/personen/irina-kaldrack/
- Martina Keup (Student Leuphana University Lüneburg) http://projects.digital-cultures.net/dcrl-experiments-interventions/environments-und-infrastrukturen/versehrte-dinge/die-projekte-versehrt/
- Ann-Christina Lange (Assistant Professor, Department of Management, Politics and Philosophy, Copenhagen Business School) http://www.cbs.dk/en/research/departments-and-centres/department-of-management-politics-and-philosophy/staff/alampp
- Martina Leeker (Senior Researcher, DCRL, Leuphana University Lüneburg) http://www.leuphana.de/en/university/staff-members/martina-leeker.html
- Liza Loop (Independent scholar and researcher; Founder and Executive Director of LO*OP Center, Inc.; Vision Keeper for the History of Computing in Learning and Education Virtual Museum)
- Antonia Majaca (Curator and researcher at the IZK Institute for Contemporary Art at the Graz University of Technology) http://izk.tugraz.at/people/faculty-staff/visiting-professor-antonia-majaca/
- Marcell Mars (Research Associate DCRL, Leuphana University Lüneburg) https://monoskop.org/Marcell_Mars
- Frieder Nake (Computer and Art, University of Bremen) http://www.hfk-bremen.de/en/profiles/n/frieder-nake
- Johanna Ochner (Student Leuphana University Lüneburg)
- Ulf Otto (Dilthey Fellow at the Institute for Media, Theatre, and Popular Culture, University of Hildesheim) http://ottó.de/personen/ulf-otto/
- Thomas Pringle (SSHRC Doctoral and Brown Presidential Fellow, Department of Modern Culture and Media, Brown University, Providence, Rhode Island USA) https://www.brown.edu/academics/modern-culture-and-media/thomas-pringle
- Robert Rapoport (Art History, University of Oxford, and Artistic Research) www.iterativeframe.com
- Jonathan Reus (Artist) https://jonathanreus.com/
- Renée Ridgway (PhD Fellow, Department of Management, Politics and Philosophy, Copenhagen Business School) http://reneeridgway.net/
- Ned Rossiter (Institute for Culture and Society Western Sydney University) https://www.westernsydney.edu.au/ics/people/researchers/ned_rossiter
- Christopher Salter (Research Chair in New Media, Technology and the Senses, Concordia University Montreal) http://chrissalter.com/
- Dubravka Sekulić (Assistant Professor, IZK Institute for Contemporary Art, Graz University of Technology) http://izk.tugraz.at/people/faculty-staff/assistant-professor-dubravka-sekulic/
- Georg Trogemann (Professor for Experimental Computing, Academy of Media Arts Cologne) https://www.khm.de/lehrende/id.21401.prof-dr-georg-trogemann/
- Georg Werner (Artist, Experimental Media) http://www.georgwerner.de/
Today we live in environments where computation and life have become inextricably intertwined. Terms such as “the anthropocene” and “the technosphere” are attempts to articulate and represent this phenomenal shift in the relationship of computation, technology, media, and life. But in actuality it might be The Age of The Algorithm, and we have become homo algorithmus: it is your algorithmically legible codes, from DNA to FICO scores, that define who you are and what you can do… and if it can’t be rendered computable, not to worry, someone somewhere is trying…
So if the anthropocene defines a new “geological” age, then we need a new “natural” history! A new taxonomy for life, techne, and earth! This workshop will produce a new “field” guide to identify, classify, and intervene in (perhaps even re-imagine) that most distinctive phylum of agents: the Algorithms.
The importance of this project cannot be overstated… We will engage with and perhaps save everyone and everything! Are you running late at the airport and having to get through security? How do you know if the NSA is surveilling you? And what are they looking for? What is that new identity card for anyway? Have you not lost enough weight with your Fitbit? Dealing with Frontex? Trying to manage disaster relief before environmental and social catastrophes? Trying to break your parents’ locks so you can watch porn? Lots of things, whether you are an immigrant, a homeowner, or just a teenager trying to have sex, get managed by algorithms. Their identification, operations, and ontologies are therefore critical to any intervention or action on contemporary techno-political-economies.
Be clear: misidentification may be dangerous to other species or to yourself, and errors in classification will turn into errors in engineering!
Format: An extremely well-designed coffee-table book that will be just what it says it is: the definitive “field” guide to algorithms, along with a supplementary website and platform for crowd-sourcing new discoveries of species, developing visualizations and iconographic representations, and updating our field guides. The goal is to never write a grant again; this will replace the “Banksy” book at Urban Outfitters. It will be the bird watcher’s Baedeker of the technosphere-anthropocene.