
Recounting her experience at the 2008 annual conference of the American Studies Association, Tara McPherson observed that attendees were suspicious of computational work. They "perceiv[ed] it to be complicit with the corporatization of higher education or as primarily technological rather than scholarly," she argued.1 Arguments continue to circulate that computational work—whether in the form of articles, digital projects, or tools—is merely technological positivism, often in the service of neoliberalism. Daniel Allington, Sarah Brouillette, and David Golumbia's polemic in the Los Angeles Review of Books, for example, argues that digital humanities is actively facilitating the neoliberal takeover of the university by supporting a field that simply trains students in technology skills for industry. Such articles have turned into a cottage industry, too often ignoring the critical lens with which many scholars engage in this work while perpetuating a myth that only specific areas of inquiry are implicated in or resistant to neoliberalism's logic.2 The perniciousness of such narrow critiques is augmented by how these articles render invisible the important work, particularly by women and scholars of color, that uses and critiques computational techniques to study topics such as race, gender, and even neoliberalism itself. This is not to say that hesitation or critique is unwarranted, but the minimal engagement with and often quick dismissal of computational techniques elicit concern about even experimenting with computational analysis.3 Such a response is even more surprising from a field so often open to methodological change.

The tide is shifting.4 Scholars are engaged in developing a critical computational humanities shaped by American studies. Computation offers a methodology for studying culture. At the same time, questions about culture provide a critical lens through which to analyze computation. Such generative tension is resulting in exciting work at the intersection of the two fields. But first, let us define our terms.

What does it mean to say computation and computational humanities? Computation is the process of using computers to calculate an output given an input. The breadth of such a process includes computing at a multitude of scales, from counting in a program like Excel to running a task that requires high-performance computing. As Stephen Ramsay argues, computers are designed for "enumeration, measurement, and verification" and provide a powerful analytic tool when paired with critical inquiry.5 The term computational humanities has emerged to suggest the use of computing for large-scale analysis of humanities data. Processing data that exceed what individuals can analyze on their own involves a form of counting or the use of an algorithm through a tool or programming language. While commonly positioned at the intersection of computer science and digital humanities, computational humanities engages with other fields including data science, (computational) linguistics, and statistics.6 Such a transdisciplinary approach creates "a digital ecology of data, algorithms, metadata, analytical and visualization tools, and new forms of scholarly expression that result from this research," as Christa Williford and Charles Henry, of the Council on Library and Information Resources, write.7

Text analysis, particularly the method of topic modeling, has enjoyed broad exposure within computational humanities. However, areas such as spatial analysis, network analysis, and image analysis are becoming increasingly prominent as scholars question the continued focus on text. This shift is being driven by scholars from fields like American studies, who have long argued that understanding culture requires analyzing forms beyond the written word, and who have responded by developing methods in conversation with fields like material culture studies, sound studies, and visual culture studies. The result is a growing number of terms used to describe these approaches, such as cultural analytics, distant listening, distant reading, distant viewing, and macroanalysis.8 All signal the computational analysis, at scale, of different kinds of corpora, including images, time-based media, sound, and text.

Data. Scale. Distance. Algorithm. Speed. Fast-forward a decade, and these terms continue to cause significant uneasiness among humanists, particularly among American studies scholars. Computational humanities has been at the center of digital humanities debates in fields like literary studies, yet remains seemingly peripheral to our field. This is intriguing given that some of the most prominent scholars and projects engaged in the computational humanities identify as Americanists in conversation with, or explicitly as part of, American studies. However, concerns remain that computation introduces claims to empiricism that are fundamentally at odds with the theories that undergird our field.

Yet, as Ryan Cordell counters, digital humanities in American studies is "seeing projects bloom that defy any dichotomy—offered in praise or condemnation—between empirical or theoretical analyses."9 Instead, the strength of an American studies computational humanities is how our field necessitates that critical questions about issues such as race, gender, and power shape both our objects of study and our methodologies, including questioning and remaking the very computational logic that makes computational humanities possible. To illustrate this point, we turn to three examples that experiment with computational methodologies while drawing on American studies efforts to expand which forms of culture should be studied.

Directed by Ryan Cordell and David Smith, Viral Texts explores reprint culture in nineteenth-century newspapers. Given the scale of the corpus, computational methods were used to identify reprinted texts in 41,829 issues.10 The goal of the project "is not to construct a definitive, empirical solution to the problem of nineteenth-century newspaper reprinting," Cordell writes, "but to facilitate an iterative conversation between the large-scale, quantitative output generated by a corpus analysis algorithm and qualitative, literary-historical readings of the surprising texts that algorithm brings into focus."11 Through this approach, the project is challenging the literary canon by revealing viral texts that have gone unnoticed. For example, the poem "Beautiful Snow" has not received scholarly attention despite appearing in more than 276 periodicals and being the most reprinted poem in the Viral Texts corpus. The poem, as Cordell and Abby Mullen argue, offers a lens into circulating ideas about gender and sexuality. The project is using computational methods to see patterns in the archive that are leading to a new bibliography of popular literature. Such a bibliography allows scholars to see the ideologies and values that shaped the era anew.

Developed by the University of Richmond's Digital Scholarship Lab (DSL), Forced Migration of Enslaved People, 1810–1860 uses spatial analysis to show how enslaved people of color were forcibly moved between the passage of the Act Prohibiting Importation of Slaves in 1807 and the Civil War. The DSL used census data, county area, and assumptions about population growth rates to calculate in-migration and out-migration of enslaved people. Over half a century, nearly a million enslaved people were relocated. While the general directionality from slave-exporting states like Maryland and Virginia to importing states in the southeast like Mississippi, Louisiana, and Texas has been apparent, Forced Migration offers a much more granular portrait of the subtle contours of the domestic slave trade. It illustrates a spatial process whereby, within the span of a generation, slaveholders in particular areas that had purchased thousands of enslaved people became sellers on the slave market, arguably because the growing enslaved population met and then exceeded the labor needed to cultivate cotton in those areas. These arguments are made through visualizations like maps and plots on a public, digital platform. Computational methods make possible new scholarship about the entanglement of capitalism and the institution of slavery through digital forms of knowledge production.
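The kind of estimate described above can be sketched computationally. The following is a minimal illustration of a census "residual" method, a standard demographic technique consistent with the DSL's stated inputs (census counts plus assumed growth rates), though the specific function, county names, figures, and growth rate here are hypothetical, not drawn from the project itself:

```python
# Residual method sketch: compare a county's actual enslaved population at a
# later census against the population expected from natural increase alone;
# the difference approximates net migration. All figures are hypothetical.

def net_migration(pop_start, pop_end, annual_growth_rate, years=10):
    """Actual change minus expected natural increase over the interval.
    Positive result suggests in-migration; negative suggests out-migration."""
    expected = pop_start * (1 + annual_growth_rate) ** years
    return pop_end - expected

# Hypothetical county figures across one decennial census interval
counties = {
    "Exporting County": {"1850": 12000, "1860": 9000},
    "Importing County": {"1850": 4000, "1860": 11000},
}

for name, pops in counties.items():
    m = net_migration(pops["1850"], pops["1860"], annual_growth_rate=0.025)
    direction = "in-migration" if m > 0 else "out-migration"
    print(f"{name}: estimated net {direction} of {abs(m):,.0f} people")
```

Even this toy version makes the essay's point legible: a county whose population grows far beyond natural increase registers as a buyer on the domestic slave market, while one whose population falls short registers as a seller.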

Pattern recognition similarly shapes the Distant Viewing (DV) project, led by me and Taylor Arnold, a statistician. Suggesting a shift from distant reading, the project is part of a growing chorus arguing that DH needs to expand beyond text to other forms such as photography and moving images, a shift that American studies has also called for.12 DV, therefore, focuses on how computer vision can be used critically to analyze moving image culture. Since the majority of the algorithms are trained on twenty-first-century data held by companies like Google and platforms like Flickr, we need to question and adapt these algorithms using machine learning informed by our areas of inquiry. We are building an open source software library—Distant Viewing Toolkit (DVT)—to facilitate the algorithmic production of metadata summarizing the content (e.g., people/actors, dialogue, scenes, objects) and style (e.g., shot angle, shot length, lighting, framing, sound) of time-based media.

A subset of the project called Distant TV applies these methods to the study of US television series. In collaboration with Annie Berke and Claudia Calhoun, the project analyzes how visual space is used by characters in over fourteen sitcoms from the network era of American television (1952–85).13 For example, we used DVT to identify the location of the main characters of Bewitched and I Dream of Jeannie—Samantha and Jeannie, respectively—in each frame, along with scene breaks, during the 1966–67 season. DVT yielded over one million detected faces and nearly five thousand shots during one season of the two shows. These elements were then used to analyze the placement and patterns of the main characters of Bewitched and I Dream of Jeannie. Whereas Samantha is rarely absent from the show for more than a single scene, Jeannie is often absent from significant portions of an episode. Such patterns of (in)visibility become interesting when one considers how the scholarship has argued that the two shows functioned similarly in the culture. What insight might such divergent formal patterns offer into how these shows' representational politics actually differed and shaped their cultural messages? While this research is preliminary, the ability to view such patterns at scale and place them in conversation with other methods such as close reading offers an expanded methodological toolkit for American studies to draw on.14

At the same time, bringing a critical lens to the computational humanities is vital. As new media scholars such as David Berry argue, software and therefore computation undergird the very way we create knowledge.15 Golumbia shows that there is a cultural logic to computation, while Wendy Hui Kyong Chun reveals how ideas of "programmability" extend well beyond the computer into the sociocultural sphere.16 Such scholarship is augmented by the emerging field of critical code studies, which close-reads software to see how it is shaped by and circulates particular cultural logics.17 American studies is building on this inquiry into the logics of computation and code as led by such scholars as Elizabeth Losh, Lev Manovich, Tara McPherson, and Lisa Nakamura. Jessica Marie Johnson and Mark Anthony Neal join this exciting work by calling for a black code studies that "centers black thought and cultural production across a range of digital platforms."18 This call charges the field to ask how the digital is harnessed to challenge systems of oppressive power, constitute communities of support and resistance, and imagine a radical future. Such an approach offers a guide for one way that American studies can challenge and shape computational humanities.

American studies brings a critical approach to its methods, continually asking scholars to question how their approach to their object of study may reinscribe the very problematic cultural and social relations the field works to reveal. One only has to turn to Janice Radway's 1998 ASA presidential address, where she questions how the very name of the field builds boundaries and risks foreclosing areas of inquiry, to see how this critical reflexivity is constitutive of the field.19 Therefore, it is no wonder that a method such as computational analysis elicits a critical response. And this is precisely why American studies needs to engage with computational humanities, for it has broader implications for our field and digital humanities. As Miriam Posner argues, "We cannot allow digital humanities to recapitulate the inequities and under-representations that plague Silicon Valley; or the systematic injustice, in our country and abroad, that silences voices and lives."20 American studies—with its history of questioning economic, political, and cultural structures—is well positioned to engage in computational work to remake its logic and its tools.

"Will computational humanities remain the exclusive domain of private companies and government agencies?" asked Manovich in 2012.21 Our response has been, and should continue to be, a resounding no. What might mapping film distribution networks tell us about cultural imperialism and resistance? What could analyzing decades of radio audio reveal about the soundscape of US culture? How might we use social media data to understand social formations? How might we develop American studies–informed algorithms? What might computational analysis of our scholarship tell us about our very field? Let's find out!

Lauren Tilton

Lauren Tilton is assistant professor of digital humanities in the Department of Rhetoric & Communication Studies and research fellow in the Digital Scholarship Lab at the University of Richmond. Her research focuses on twentieth-century US visual culture. She is a codirector of Photogrammar, a digital public humanities project mapping New Deal and World War II documentary expression, and coauthor of Humanities Data in R: Exploring Networks, Geospatial Data, Images, and Texts (Springer, 2015).


I would like to thank Taylor Arnold, Amy Earhart, and Robert Nelson for their generous feedback on this essay.

1. Tara McPherson, "Why Are the Digital Humanities So White? or Thinking the Histories of Race and Computation," in Debates in the Digital Humanities, ed. Matthew K. Gold (Minneapolis: University of Minnesota Press, 2012).

2. For examples, see Daniel Allington, Sarah Brouillette, and David Golumbia, "Neoliberal Tools (and Archives): A Political History of Digital Humanities," Los Angeles Review of Books, May 1, 2016; Carl Straumsheim, "Digital Humanities as 'Corporatist Restructuring,'" May 6, 2016; Timothy Brennan, "The Digital-Humanities Bust," Chronicle of Higher Education, October 15, 2017; Sarah Bond, "How Is Digital Mapping Changing the Way We Visualize Racism and Segregation," October 20, 2017; Richard Grusin, "The Dark Side of the Digital Humanities—Part 2," Thinking C21, January 9, 2013.

3. McPherson, "Why Are the Digital Humanities So White?"

4. Here I would like to acknowledge the work of scholars like Elizabeth Losh, Tara McPherson, Miriam Posner, and Jacqueline Wernimont, who have been foundational to an American studies digital humanities. The American Studies Association's Digital Humanities Caucus has also been working to bring to the fore DH scholarship.

5. Stephen Ramsay, Reading Machines: Toward an Algorithmic Criticism (Urbana: University of Illinois Press, 2011).

6. Chris Biemann, Gregory R. Crane, Christiane D. Fellbaum, and Alexander Mehler, eds., "Computational Humanities—Bridging the Gap between Computer Science and Digital Humanities," report from Dagstuhl Seminar 14301, July 20–25, 2014.

7. Christa Williford and Charles Henry, "One Culture: Computationally Intensive Research in the Humanities and Social Sciences: A Report on the Experiences of First Respondents to the Digging into Data Challenge," Council on Library and Information Resources, 2012.

8. Taylor Arnold and Lauren Tilton, "Analyzing Moving Image Culture," Distant Viewing (blog); Tanya Clement, "Distant Listening or Playing Visualisations Pleasantly with the Eyes and Ears," Digital Studies / Le champ numérique (2013); Matthew Jockers, Macroanalysis (Urbana: University of Illinois Press, 2013); Lev Manovich, "The Science of Culture? Social Computing, Digital Humanities, and Cultural Analytics," Journal of Cultural Analytics, May 23, 2016; Franco Moretti, Distant Reading (London: Verso Books, 2013).

9. Ryan Cordell, "A Larger View of Digital American Studies," Amerikastudien / American Studies 61.3 (2016).

10. David A. Smith, Ryan Cordell, and Abby Mullen, "Computational Methods for Uncovering Reprinted Texts in Antebellum Newspapers," American Literary History 27.3 (2015): E1–E15.

11. Cordell, "Larger View of Digital American Studies."

12. Charles R. Acland and Eric Hoyt, eds., The Arclight Guidebook to Media History and the Digital Humanities (Falmer: REFRAME Books, 2016); Tanya Clement, "Multiliteracies in the Undergraduate Digital Humanities Curriculum," in Digital Humanities Pedagogy: Practices, Principles, and Politics, ed. B. Hirsch (London: Open Book Publishers, 2012); Manovich, "The Science of Culture?"; Miriam Posner, "Digital Humanities and the Allure of the Absurd," InMediaRes, April 15, 2013.

13. For more on Distant Viewing, see

14. Miriam Posner argues that we must bring nuance to representing gender and race rather than create simple data structures that reinscribe often violent characterizations of these same categories. While we may not be entirely successful, we are trying to heed this important call to challenge, not replicate through DH, the problematic and often oppressive organizational logics that have been used to structure people's lives. Lauren Klein echoed Posner's argument in her presentation at MLA, where she addressed how to approach distant reading in light of recent revelations about Franco Moretti. She argues that interrogating power is critical to restructuring distant reading so that the method does not replicate structural inequality. See Klein, "Distant Reading after Moretti," presentation, Modern Language Association annual convention, 2018; Miriam Posner, "What's Next: The Radical, Unrealized Potential of Digital Humanities," in Debates in the Digital Humanities 2016, ed. Matthew K. Gold and Lauren F. Klein (Minneapolis: University of Minnesota Press, 2016).

15. David M. Berry, "The Computational Turn: Thinking about the Digital Humanities," Culture Machine 12 (2011).

16. David Golumbia, The Cultural Logic of Computation (Cambridge, MA: Harvard University Press, 2009); Wendy Chun, Programmed Visions: Software and Memory (Cambridge, MA: MIT Press, 2011).

17. Mark Marino, "Critical Code Studies," electronic book review (2006).

18. Jessica Marie Johnson and Mark Anthony Neal, "Introduction: Wild Seed in the Machine," Black Scholar 47.3 (2017): 1–2.

19. Janice Radway, "What's in a Name? Presidential Address to the American Studies Association, 20 November, 1998," American Quarterly 51.1 (1999): 1–32.

20. Posner, "What's Next."

21. Lev Manovich, "Computational Humanities vs. Digital Humanities," Software Studies Initiative (blog), March 16, 2012.

