|Daniel Rich, Drone, 2011.|
|Daniel Rich, BT Tower, Birmingham, UK, 2010.|
|Daniel Rich, Obama's Visit to Baghdad, 2008.|
|Daniel Rich, Archive/Berlin, 2010.|
Algorithms play an increasingly important role in selecting what information is considered most relevant to us, a crucial feature of our participation in public life. Search engines help us navigate massive databases of information, or the entire web. Recommendation algorithms map our preferences against others, suggesting new or forgotten bits of culture for us to encounter. Algorithms manage our interactions on social networking sites, highlighting the news of one friend while excluding another's. Algorithms designed to calculate what is "hot" or "trending" or "most discussed" skim the cream from the seemingly boundless chatter that's on offer. Together, these algorithms not only help us find information, they provide a means to know what there is to know and how to know it, to participate in social and political discourse, and to familiarize ourselves with the publics in which we participate. They are now a key logic governing the flows of information on which we depend, with the "power to enable and assign meaningfulness, managing how information is perceived by users, the 'distribution of the sensible'" (Langlois 2012).
Algorithms need not be software: in the broadest sense, they are encoded procedures for transforming input data into a desired output, based on specified calculations. The procedures name both a problem and the steps by which it should be solved. Instructions for navigation may be considered an algorithm, as may the mathematical formulas required to predict the movement of a celestial body across the sky. "Algorithms do things, and their syntax embodies a command structure to enable this to happen" (Goffey 2008, 17). We might think of computers, then, fundamentally as algorithm machines - designed to store and read data, apply mathematical procedures to it in a controlled fashion, and offer new information as the output. But these are procedures that could conceivably be done by hand - and in fact once were (Light 1999).
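To make this definition concrete, here is a minimal, purely illustrative sketch (not drawn from the essay): an encoded procedure that transforms input data into a desired output according to a specified calculation - in this case, a toy relevance ranking in which a document's score is simply how often the query's terms appear in it. The scoring rule and the example corpus are my own invented assumptions, chosen only to show the "problem and the steps by which it should be solved" in executable form.

```python
# Illustrative sketch only: a toy "public relevance algorithm".
# The "specified calculation" here is a hypothetical term-count score.

def relevance_score(document: str, query: str) -> int:
    """Count how many times each query term occurs in the document."""
    terms = query.lower().split()
    words = document.lower().split()
    return sum(words.count(term) for term in terms)

def rank(documents: list[str], query: str) -> list[str]:
    """The 'command structure': order the corpus by score, highest first."""
    return sorted(documents, key=lambda d: relevance_score(d, query), reverse=True)

# A made-up corpus, just to exercise the procedure.
docs = [
    "weather report for tomorrow",
    "algorithms and public knowledge",
    "public discourse on algorithms and algorithms in public life",
]
print(rank(docs, "public algorithms"))
```

Trivial as it is, the sketch shows where the politics enter: the choice of scoring rule, not the sorting machinery, decides what counts as "most relevant."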
But as we have embraced computational tools as our primary media of expression, and have made not just mathematics but all information digital, we are subjecting human discourse and knowledge to these procedural logics that undergird all computation. And there are specific implications when we use algorithms to select what is most relevant from a corpus of data composed of traces of our activities, preferences, and expressions.
These algorithms, which I'll call public relevance algorithms, are - by the very same mathematical procedures - producing and certifying knowledge. The algorithmic assessment of information, then, represents a particular knowledge logic, one built on specific presumptions about what knowledge is and how one should identify its most relevant components. That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the word of God.
What we need is an interrogation of algorithms as a key feature of our information ecosystem (Anderson 2011), and of the cultural forms emerging in their shadows (Striphas 2009), with a close attention to where and in what ways the introduction of algorithms into human knowledge practices may have political ramifications. This essay is a conceptual map to do just that. I will highlight six dimensions of public relevance algorithms that have political valence:
1. Patterns of inclusion: the choices behind what makes it into an index in the first place, what is excluded, and how data is made algorithm-ready
2. Cycles of anticipation: the implications of algorithm providers' attempts to thoroughly know and predict their users, and how the conclusions they draw can matter
3. The evaluation of relevance: the criteria by which algorithms determine what is relevant, how those criteria are obscured from us, and how they enact political choices about appropriate and legitimate knowledge
4. The promise of algorithmic objectivity: the way the technical character of the algorithm is positioned as an assurance of impartiality, and how that claim is maintained in the face of controversy
5. Entanglement with practice: how users reshape their practices to suit the algorithms they depend on, and how they can turn algorithms into terrains for political contest, sometimes even to interrogate the politics of the algorithm itself
6. The production of calculated publics: how the algorithmic presentation of publics back to themselves shapes a public's sense of itself, and who is best positioned to benefit from that knowledge
Considering how quickly these technologies, and the uses to which they are put, are changing, this list must be taken as provisional, not exhaustive. But as I see it, these are the most important lines of inquiry for understanding algorithms as emerging tools of public knowledge and discourse.
Tarleton Gillespie, The Relevance of Algorithms, forthcoming in Media Technologies, ed. Tarleton Gillespie, Pablo Boczkowski, and Kirsten Foot. Cambridge, MA: MIT Press, 2012.
Langlois, Ganaele. 2012. Participatory culture and the new governance of communication: The paradox of participatory media. Television and New Media.
Goffey, Andrew. 2008. Algorithm. In Software Studies: A Lexicon, ed. Matthew Fuller, 15-20. Cambridge, MA: MIT Press.
Light, Jennifer. 1999. When computers were women. Technology and Culture 40 (3): 455-483.
Anderson, C. W. 2011. Deliberative, agonistic, and algorithmic audiences: Journalism's vision of its public in an age of audience. International Journal of Communication 5: 529-547.
Striphas, Ted. 2009. The Late Age of Print: Everyday Book Culture from Consumerism to Control. New York: Columbia University Press.