Nick Seaver’s recent article “Algorithms as Culture” has some really good guidance for folks wanting to study algorithmic systems using ethnographic methods (Seaver, 2017). The paper discusses a set of concrete techniques, and provides much needed (IMHO) information about the practical craft of critical algorithm studies. Seaver points out that algorithmic systems aren’t simply black boxes, or sites that can be opened up and understood once and for all. Studying algorithms requires methodologies that recognize how they are deployed in the world as part of culture.

Algorithmic systems can be difficult to study because they often don’t live in a particular place, and aren’t known by any one set of individuals. Algorithms are often distributed across systems and workflows that combine computation with people and their activities and experiences. This puts up some real roadblocks for data collection in traditional participant observation settings. Attempts to understand algorithmic processes often put the researcher right into the beating heart of an organization, where information can be guarded for competitive reasons, or because the information itself could allow the company’s services to be subverted or gamed. Increasingly, research on algorithms can also lead to criticism or bad press, which many organizations will react to negatively.

Seaver has several pieces of advice, and it’s worth reading the full article to understand them all. But the big ones for me were the importance of scavenging and of the interview as ethnographic tools. Scavenging is a technique of using scraps of publicly available information, and constellating them in useful ways:

If our interest is not in the specific configuration of a particular algorithm at one moment in time, but in the more persistent cultural worlds algorithms are part of, then useful evidence is not bounded by corporate secrecy. In my own research, I learned from off-the-record chats with engineers about industry scuttlebutt, triangulated with press releases and the social media updates of my interlocutors. Sometimes, interviewees stuck resolutely to the company line; other times, often after several interviews, they spilled the beans. In academic and industry conference hallways, people working in diverse sites talked across their differences and around their various obligations to secrecy, providing a rich source of information about how algorithms and their meanings vary.

This idea of scavenging reminded me a bit of journalism, but it really draws on work by Gusterson (1997), who developed a technique of polymorphous engagement in order to study the culture of nuclear weapons scientists. Seaver gives this technique the much more evocative name scavenging. In general the technique is useful in situations where the researcher is studying up: attempting to understand a setting to which one is denied access, and where there is a power differential between the subjects of study and the one doing the studying.

As one scavenges for information, Seaver advises ethnographers to reflect on the nature of the access they are granted or discover. He calls this the texture of access. Keeping field notes about how information is gathered is a key part of this process. He points to the work of Jensen (2010), who treats barriers to access, such as non-disclosure agreements, as sites for study in their own right.

While ethnographers prefer observing what people do to asking them what they do, Seaver suggests that studying algorithms often relies on interview data gathered in a variety of settings. These interviews often cannot take place in controlled environments, so the researcher needs to be flexible and adapt to situations as they arise. Here Seaver is also pointing to the importance of studying language, because people do things with their words. He cites Bakhtin’s idea of heteroglossia as a way to dig into public corporate speak. This also suggests discourse analysis, linguistic anthropology, and the ethnography of communication as viable frameworks for doing this type of research.

Lastly, and perhaps most importantly for my own work, Seaver connects the dots between studying computational systems and practice theory, specifically the praxiography of Annemarie Mol. Seaver agrees with Dourish (2016) that the word algorithm means different things to different people. But he makes the interesting connection to Mol’s idea of objects being multiple, generated by practices. These practices are themselves worthy of study, especially where they intersect with each other. Seaver cites Gad, Jensen, & Winthereik (2015), who call this practical ontology, which could be useful to follow up on. The main point Seaver is making is that, for the purposes of anthropology, it’s useful to look at algorithms as culture, and not as separate objects that somehow stand outside of culture, which is more the perspective of computer science, and to some extent platform studies.

References

Dourish, P. (2016). Algorithms and their others: Algorithmic culture in context. Big Data & Society, 3(2).
Gad, C., Jensen, C. B., & Winthereik, B. R. (2015). Practical ontology: Worlds in STS and anthropology. NatureCulture, (3), 67–86.
Gusterson, H. (1997). Studying up revisited. PoLAR: Political and Legal Anthropology Review, 20(1), 114–119.
Jensen, C. B. (2010). Asymmetries of knowledge: Mediated ethnography and ICT for development. Methodological Innovations Online, 5(1), 72–85.
Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2). https://doi.org/10.1177/2053951717738104