Reviewed by:
  • Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble, and: Pattern Discrimination by Clemens Apprich et al.
  • Tom Welch
ALGORITHMS OF OPPRESSION: How Search Engines Reinforce Racism
by Safiya Umoja Noble
New York University Press, 2018
256 pp.; paper, $28.00
PATTERN DISCRIMINATION
by Clemens Apprich, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl
University of Minnesota Press, 2018
124 pp.; paper, $25.00

Algorithms of Oppression and Pattern Discrimination are two works that grapple with the disjuncture between big data's supposed objectivity and the very real ways in which it discriminates on the basis of identity. Specifically, both books build on the idea that the imagined objectivity of big data and search algorithms masks the fact that human beings compile, read, sort, and interpret data, and that these interactions reflect the implicit and explicit prejudices and values of the society in which those human beings live. Both books also attempt to use humanist disciplines to question and contextualize the practices and assumptions of information and computer science. While Safiya Umoja Noble merges her background in library and information science with critical theories of race, gender, and technology, the authors of Pattern Discrimination draw on English and cultural studies theories to inform the pattern-discerning methods used in computer science. But while Noble's Algorithms of Oppression is interested in the material realities and consequences of Google as a company, Pattern Discrimination is much more theoretical and general in its approach. Because of these major differences, the books serve separate but overlapping audiences: Algorithms of Oppression is useful primarily for researchers interested in studying search engines such as Google and their relationships to race, gender, capital, and the state, while Pattern Discrimination is more useful for providing theories and methodological frameworks for addressing big data's social value more generally, even as its authors often tie these broad theories to questions of identity.

Safiya Umoja Noble's Algorithms of Oppression argues that the algorithmic interpretation of big data, especially as it relates to search, creates new technological structures of racism and sexism that both reflect and reinforce the prejudices of the people who create those algorithms as well as of those who use them. Using Google as her primary case study, Noble attempts to "further an exploration into some of these digital sense-making processes and how they have come to be so fundamental to the classification and organization of information and at what cost" (2). The book is organized as a series of thematically linked but independent case studies of algorithmically informed racism and sexism. At the same time, Noble recognizes that the challenge in writing a book about the internet is that the internet will change almost as soon as the book is written. She therefore stresses that her case studies document a continuous, historical pattern of "algorithmic oppression," one that shows such cases to be fundamental to the structures [End Page 72] and codes of the web rather than glitches or outliers, as Google often claims. Accordingly, the book is organized not chronologically but thematically, with each of the six body chapters examining the intersection of Google's algorithm with a different aspect of public life and structure of power.

Chapter 1, "A Society, Searching," deals with the searches themselves and grapples with questions about the corporate control of information. Importantly, rather than perpetuating the common concern that racist and sexist Google searches are a consequence of racist and sexist Google users, Noble points out that the infrastructure of Google's search itself engenders problematic results by pushing them to the top. The chapter questions whether one ought to trust private corporations such as Google to be information curators. Chapter 2, "Searching for Black Girls," considers Google's role in the perpetuation of stereotypes and oppression through search results, noting that searching for "black girls" and other racialized keywords usually returns negative and often pornographic results. Searching for whiteness, on the other hand, often leads to stock photos and other "unmarked" results. Chapter 3, "Searching for People and Communities," continues this logic with a specific case study...
