- Big Novels/Big Data
What is the state of the novel in the age of big data? Sometimes, it’s just a question of scale. Novels have much to say about human experience in the midst of a medial shift from the contained codex to the never-ending World Wide Web. Increasingly, it seems, they speak not just in words on the page but through the sheer number of pages they contain. New work by such writers as William T. Vollmann and Karen Tei Yamashita takes up lots of space on the shelf, but none expresses the commitment to bigness quite like Mark Z. Danielewski’s ambitious promise to publish an epic twenty-seven-volume serial narrative wherein each book is itself a tome. As the earlier reviews indicate, the first and second books in Danielewski’s new Familiar series clock in at over 800 pages—with a third volume of equal immensity due later this year.
In short, the trend towards bigness in bookish bulk is about building the novel to scale in an age of big data. But big novels are, of course, nothing new. From the eighteenth- and nineteenth-century tome to the heavy hitters of the twentieth-century experimental novel—a lineage including Fielding, Melville, Faulkner, and Stein as much as Joyce, Pynchon, Silko, and Wallace—the novel takes up space. Confronted with this older legacy, twenty-first-century maximalism must respond to a critical question: is the contemporary trend towards bigness an ironic or an expected outcome for the print novel in the moment of the book’s supposed obsolescence due to digital technologies?
I would venture to say, yes. The novel has always been a material artifact, but this fact is ever more apparent and of increased aesthetic interest in the face of seemingly disembodied digital data. That is, the big book counts its heft as part of its signification: occupying literal space on the bookshelf, its physicality plays a vital role in its meaning-making.
Literary criticism is also going big. Inspired by computational practices of textual analysis and digital visualization tools, literary critics are exploring “distant reading” (see Franco Moretti), moving from close reading small objects (a poem, a passage, a theme) to analyzing big data sets drawn from literary corpora of all sorts (the titles of all novels published during a certain period, for instance). It’s an interesting time for literary analysis, when ideas about what counts as reading and how we do it are shifting along with our reading devices and tools.
In this moment, the bigness of the print-based and bookbound novel registers particular significance. At once in cahoots with big data and rebelling against it, the big novel mimics information overload even as it seems to contain the entire world (and, often, the World Wide Web) within it. Resolute in their bigness, such books proclaim that the novel is here…and it’s not going anywhere. [End Page 14]
Jessica Pressman is Assistant Professor of English and Comparative Literature at San Diego State University, where she also directs SDSU’s Digital Humanities Initiative (dh.sdsu.edu). She is the author of Digital Modernism: Making It New in New Media (2014), co-author, with Mark C. Marino and Jeremy Douglass, of Reading Project: A Collaborative Analysis of William Poundstone’s Project for Tachistoscope (2015), and co-editor, with N. Katherine Hayles, of Comparative Textual Media: Transforming the Humanities in a Postprint Era (2013).