This is just a quick note about another excellent piece from Nick Seaver on why it’s useful, and necessary, to study algorithms ethnographically or anthropologically.

Seaver, N. (2018). What should an anthropology of algorithms do? Cultural Anthropology, 33(3), 375–385.

The main argument Seaver makes here is that this is necessary because algorithms are never purely technical objects; they always have a human component:

If you cannot see a human in the loop, then you just need to look for a bigger loop. (p. 378)

This memorable phrase references the idea of the human-in-the-loop (HITL) from the study of Human-Computer Interaction (HCI). The so-called loop of interaction in HCI typically focuses on the interface layer, where a human works iteratively and collaboratively with a computer to achieve some goal. The ability of the human to reach that goal is generally taken as a measure of the effectiveness of the design. So HCI research is somewhat fixated on design, on building a better mousetrap.

It’s important to note here that there are often two types of humans in the loop in this mode of HCI research: the user who is operating the computer, and the researcher who is studying the user as they use the computer, in order to design an interface or interaction. The interface (Hookway, 2014) that connects the computer system with the user is the primary object of study. The goal is to design a better interface, mode of interaction, or technical affordance. But by widening the loop, Seaver expands the subject of study to include the designer or researcher as well. This move allows us to reflect not only on the effectiveness of a particular interface or interaction, but also on what problems are selected for study in the first place.

The site of human and computer interaction has also been studied from a sociotechnical perspective, even ethnographically, for quite some time now. I’m thinking of the pioneering work of Suchman (1985), Star (1999), and Orlikowski (2007), which we can see reflected in the recent work of Dourish (2017) and others. In fact it is part of a general turn to practice that can be seen across a large swathe of HCI research (Kuutti & Bannon, 2014).

Seaver is doing something interesting here by saying that we cannot conceive of algorithmic operations as autonomous agents that are severed from human influence:

There is no independent software domain from which computers send forth their agents to insinuate themselves into our cultural worlds. Like any other technology, algorithms can produce unanticipated results or prove challenging to manipulate, but they have no wicked autonomy. Especially in the case of the constantly tweaked and revised algorithms that filter news or make recommendations, these devices work at the pleasure of people who can change them, turn them off, or reorient them toward other goals. These choices emerge from particular cultural worlds, not from some technical outside. Algorithms, we can say, are culture–another name for a collection of human choices and practices (Seaver, 2017). If we want to understand how they work and especially how they change, we need to attend to people like Brad and the worlds of reference in which they make decisions, not to idealized visions of autonomous algorithmic logics.

On the one hand this seems in keeping with much sociotechnical research that treats the human and the technical as two sides of the same coin, deeply interconnected. On the other hand, Seaver is making a somewhat provocative claim that algorithms and computer systems lack autonomy and the ability to act as agents.

But remember the two humans in the loop? When Seaver talks about the ability to change algorithms, turn them off, or reorient them, he is talking about the designer, not the general user, from whom these algorithmic operations are often completely obscured and very difficult, if not impossible, to introspect on or change, at least intentionally. The user is caught in a loop of interaction that they have not designed, and have very little ability to influence without attempting to remove themselves from the loop entirely (Diakopoulos, 2014; Eubanks, 2018; O’Neil, 2016) or attempting to hack the algorithms (Brunton & Nissenbaum, 2015). Considering the autonomy or agency of algorithms is a question of relations, where positionality matters.

In a way, I’m defending the validity of treating software as a fully fledged actor, not simply as a minion that does the bidding of human masters who are in complete control. These algorithmic systems bite back, often in ways we don’t anticipate. I’m thinking specifically of how a deeply anthropological approach like Actor-Network-Theory offers a way of exploring the dense set of relations between people and computer systems, where the automated systems have a social, historical and material dimension that is not simply reducible to the humans who have designed them for a given purpose (Latour, 2005). Indeed, our purposes and goals, the things that we design algorithms for, are conditioned and shaped by these non-human agents. Algorithms do not operate as agents independent of cultural worlds, but whether they have autonomy or not, they work on us as much as we work on them.

Seaver is asking us to tread carefully, and not to fall into the trap of overly mythologizing algorithms and, in doing so, forgetting that they are in fact designed by folks like us. Anthropology and ethnography provide a viable theory and methodology for understanding how algorithms are fashioned by a set of human practices. But it is important to remember who is doing the designing, who is doing the using, and who is being used. There are many types of humans in many types of loops.

References

Brunton, F., & Nissenbaum, H. (2015). Obfuscation: A user’s guide for privacy and protest. MIT Press.
Diakopoulos, N. (2014). Algorithmic accountability reporting: On the investigation of black boxes. Tow Center for Digital Journalism, Columbia University.
Dourish, P. (2017). The stuff of bits: An essay on the materialities of information. MIT Press.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press.
Hookway, B. (2014). Interface. MIT Press.
Kuutti, K., & Bannon, L. J. (2014). The turn to practice in HCI: Towards a research agenda. In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (pp. 3543–3552). Retrieved from http://dl.acm.org/citation.cfm?id=2557111
Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford University Press.
O’Neil, C. (2016). Weapons of math destruction. Crown.
Orlikowski, W. J. (2007). Sociomaterial practices: Exploring technology at work. Organization Studies, 28(9), 1435–1448. https://doi.org/10.1177/0170840607081138
Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2). https://doi.org/10.1177/2053951717738104
Star, S. L. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377–391.
Suchman, L. (1985). Plans and situated actions: The problem of human-machine communication. Xerox Corporation.