- Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin
That digital code—the interminable combination of binary numbers underwriting our high-tech age—can be apolitical, unbiased, and colorblind is a techno-utopic fantasy undercut by decades of data-driven and encrypted inequities. In Race After Technology, Ruha Benjamin analyzes the mechanisms behind a digital caste system that she calls the "New Jim Code": the reproduction of historical forms of discrimination by modern technologies that are perceived and promoted as objective or progressive. Benjamin considers the ambitions and methods of a wide range of programmers and initiatives, including some with democratic aims. And yet, as she argues, even "technical fixes" to systemic inequalities in housing, education, healthcare, and policing lead, very often, to more insidious forms of racism, insidious in that the perpetrated wrongs become harder to recognize. Marketing algorithms, for example, direct merchandise, real estate, and services based on "ethnic preferences" that reify class and race divisions. Crime prediction software intended to offer greater degrees of objectivity creates justifications for tracking and surveilling communities of color. And "neutral tools," like the ubiquitous voice of Siri, uphold whiteness and the white aesthetic as norms.
For Benjamin, it is impossible to understand how "biased bots, altruistic algorithms, and their many coded cousins" produce inequity without recognizing racism as a technology unto itself: a tool designed to "stratify and sanctify social injustice as part of the architecture of everyday life" (p. 17). This is to say that the racist outputs of algorithms designed to track, predict, and persuade are an inevitable consequence of the social context in which these formulas and procedures have been designed. Bringing the perspectives of critical race studies to bear on science and technology studies (STS), Benjamin aims to offer a corrective to the putatively post-racial techno-determinism of Silicon Valley and of much STS scholarship (p. 41). In the process, she reminds us that racism thrives by shape-shifting. Her chapters explore "the avant-garde stylings of NextGen Racism"—a truly wide range of encoded discrimination—in the hopes that we might address these in a timely and practical manner (p. 46).
Chapter 1 examines robots, or technology capable of machine learning. Benjamin calls the diverse array of artificial intelligence utilized across a rapidly increasing number of sectors "humanity's finest handiwork" (p. 54). And yet, as her chapter reveals, the datasets robots use—that is, the raw constituents of their machine learning—turn out to be riddled with the racial, economic, and gendered biases of their programmers and programming institutions; the decisions robots make (their predictions, recommendations, and outcomes) therefore reproduce longstanding cultural prejudices and hierarchies. Benjamin points to Beauty AI as a paradigmatic example. In 2016, an international team of programmers trained robots to assess the attractiveness of over 6,000 contestants from about 100 countries based on a lengthy set of supposedly objective parameters. In the end, Beauty AI overwhelmingly preferred white faces (88 percent of the time). Only one person with "visibly dark skin" placed as a finalist across all the age categories taken together (p. 50). The real surprise is not the outcome of this "first ever beauty contest judged by robots" but the gross naivete of everyone involved—the designers, biogerontologists, and data scientists—especially considering that the robots were trained to recognize skin color (p. 49). While Benjamin offers example after example of unintentional coded bias in this and other chapters, she stops short of calling out the tech sector and its numerous eager partners (in healthcare, private industry, policing, and government) for what comes across, at least to my mind, as a staggering degree of ignorance about how social pathologies like racism, misogyny, and xenophobia permeate various structures—psychological, institutional, and architectural.
Race After Technology provides a de facto argument in favor of the humanities, though this is not the expressed intention of its author; the programmers and projects Benjamin profiles, those charged with imagining Human 2.0, seem to lack a sufficiently nuanced and intelligent grasp of personhood in its present...