New Stationary States: Real Time and History’s Disquiet
The problem of time, its tools, and its functions, inevitably looms behind all the enterprises in which it is engaged. One may not be interested in it; it will soon establish itself in the foreground. One may pretend to ignore its presence; it nevertheless remains present behind all the words which one employs.
The Age of Expansion
In that altogether too famous essay often described as his “essay on the memex,” Vannevar Bush, director of the wartime United States Office of Scientific Research and Development, observed that “[i]f the aggregate time spent in writing scholarly works and in reading them could be evaluated, the ratio between these amounts of time might well be startling.”1 Noting a “mountain of research” now growing faster and out of all proportion to scholars’ ability to collectively absorb what they collectively produce, Bush described the knowledge worker “staggered by the findings and conclusions of thousands of other workers—conclusions which he cannot find time to grasp, much less to remember, as they appear.” “Yet specialization,” Bush continued, in a characteristically terse expression of what we might call the antinomy of the technocratic imperative, “becomes increasingly necessary for progress, and the effort to bridge between disciplines is correspondingly superficial.”
The baldly temporal problem of finding human time to read, and thus determine a human use for what mechanically extended researchers can [End Page 179] produce, has been with us in the United States for a long time, and longer elsewhere. Without doubt, it has been aggravated and refined in specific ways by four decades of relative economic contraction and the further subdivision of labor in the U.S. university that accompanied its unabated, if largely financialized, expansion after the world economic crisis of the 1970s. It is equally without doubt that its roots lie in the long American nineteenth century, with industrial revolution, manifest destiny, and mechanized civil war. But in the form in which it affects us most acutely, in 2012, the temporization of research—by which I mean its improvisation, its deferral, and its truncative presentism or “trimming,” all at once—might be said to derive from the scale of science applied in the second and final great war, the one that generated the episteme indexed by Harold Innis’ apothegm “The interest in post-war problems is the post-war problem” (1946, 56).
In his author’s foreword to Giles Goat-Boy, the novel he began while teaching at Pennsylvania State University in the late 1950s and early ’60s, John Barth described an “epidemic of academic gigantism” beginning with that war and multiplied by the Sputnik crisis, in “a massive effort to ‘catch up,’ fueled by an inpouring of federal money that would fertilize the groves of Academe right through the Sixties” (1987, v). By most accounts, the literary humanities in the U.S. came comparatively late and gradually to this hyperproduction party. But it is unquestionable that the late-blooming and sometimes necessarily “silent” work of cumulative human wisdom has been nothing less than thoroughly colonized, now, by the disjunctive form of the scientific breakthrough noisily achieved in relative youth, and that this has decisively transformed the critical climate in which we operate. Indeed, we might well describe our chronic overproduction of research monographs, today, in their displacement of the critical essays of yore, as a refraction of that oblique appropriation of technoscience, through which a discipline practically weakened by the war relinquished the only advantage it retained over its triumphant rival: the not at all useless, and not at all publicly scorned normativity of critical vision in a universe of amorally mechanized research.
To the extent that a newly arrived and thoroughly wonkish new or digital media studies is today the latest and greatest white humanist hope for salvation through grant funding and facility in technical administration, it might prompt us to consider, once again, the tragedy of Marshall McLuhan, a writer and thinker I admire and take seriously, against the grain of the dominant cultures of literary humanism and antihumanism even today. By “the tragedy of McLuhan” I mean not so much McLuhan’s steadfast and extreme refusal of proper academic discipline for satire and self-satire, in what his most sensitive reader, Glenn Willmott, has called “symbolic self-sacrifice to the problem of the critic itself” (1996, 207). I mean rather McLuhan’s embrace of discipline at its absolutely most typical extreme, in the hyperexpansive hyperactivity with which McLuhan the English professor traveled the knowable world seeking to expand it to his own dimension, in a denial of finitude producing interlocking epistemic and corporeal consequences. Postwar literary studies’ [End Page 180] first and greatest media guru would die of elected overwork, expiring before the age of seventy, disabled by stress- and travel-aggravated congestive heart failure, stroke, and at the very end, the horrifyingly overdetermined torture of total aphasia, a punishment worse than death for a graphomaniac and a great conversationalist.
That the University of Toronto shuttered McLuhan’s Centre for Culture and Technology before the guru was even in his grave might prompt us to consider a warning of Norbert Wiener, who, unlike McLuhan, was a genuine technocrat at the genuine center of the postwar new world order: that “We must value leisure.” Delivered with a suggestion that the new postwar order was “going to be a difficult time,” and that the new applied scientists may well “deserve the punishment of idolators,” Wiener’s lament concludes with the observation that “the medieval attitude is the attitude of the fairy tale in many things, but the attitude of the fairy tale is very wise in many things that are relevant to modern life” (1954, 28). In Wiener’s own mind at least, this threat served as stick to the carrot of a project truly worthy of the human name:
We must make a great many changes in the way we live with other people. We must value leisure. We must turn the great leaders of business, of industry, of politics, into a state of mind in which they will consider the leisure of people as their business and not as something to be passed off as none of their business.(1954, 26-27)
A sympathetic critical reader of McLuhan who is also a reader of human life in modern human institutions might have a hard time resisting the thought that McLuhan, reflecting at the end on all he had done that was worthwhile, must have sensed also that much of it simply had not needed to be done, in a world that really was historically prone to involution. By involution I mean “implosion,” as McLuhan imagined it, in that strain of his work most deeply derived from Innis: not the euphoric global village, but the heat-death of the Western empires, an entirely earthly and sensible challenge to modernity conceived productivistically as an irreversible fait accompli.
The fate of McLuhan’s attempt to track and mirror that devolution, in his work and activities through the upheaval of the North American 1960s, vividly illustrates the plight of the critic and scholar of contemporary culture, who is driven to aspire to “real time” intellectual response at a level more substantive, and so time-consuming, than her counterparts in the cultural public sphere. “Real time” is a term subject to a certain semantic drift. In technical and business computing, a “real time” system responds to instructions within a bounded time—its “bounds” being the maximum processing delay, or interval between instruction and response, acceptable for defining usability. In systems design, real-time computing marks the end of the era of computer “programming” as such (that is, asynchronous batch-processing of instructions). In marketing, it articulates a dream of access to consumer behavior virtually as it occurs; in banking, it marks usable human access to [End Page 181] both human-directed and automated market behavior. A real time system may or may not be fast, depending on how fast a response is desired; the need to take interactive input from a user leaves it necessarily inefficient in any case. Its function is to create a human-recognizable virtual world duplicating, in whatever particular aspects are desired, the phenomenal world of lived time, as rapidly and consistently as it can be recognized. Since the ideal bound on response is human-recognizable instantaneity, real time is paradoxically a form of finitude, limited by human cognition, but ideally immediate, non-mediated, within that finite bound. In the frisson of knowledge it feeds us, the liveness of real time appears to reflect, for the cognition monitoring it, the objective autonomy of a self-organized system.2 In that aspect, more than anything else, it marks the desire of the cultural critic for immersion in the cultural present modulated by retained critical distance.
It is, in other words, a simulation of nonmediation, affording the critic intimate contact with the vitalized transport of the artist’s transformative, transcendentally immersive primary creative productivity, while securing criticism from contamination by it. In developing a social history of visual broadcasting, Raymond Williams called this tele-visual recognition “planned flow”: as good a name as any for Williams’ own mode of engaging culture, as an adamant critical modernist, over-recognized for his intellect, who wrote under-appreciated novels, short stories, and plays on the side.3
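The computing sense of “real time” sketched above (a response bound, rather than raw speed, as the criterion of usability) can be made concrete in a few lines of Python. The sketch is purely illustrative; the function and variable names are my own, drawn from no actual real-time systems library:

```python
import time

def respond_within(bound_s, instruction, handler):
    """Run a handler on an instruction and report whether the response
    met the bound. A "real time" system, in the computing sense, is one
    whose responses always fall within such a bound: the bound itself,
    fast or slow, is what defines usability."""
    start = time.monotonic()  # monotonic clock, immune to wall-clock resets
    result = handler(instruction)
    elapsed = time.monotonic() - start
    return result, elapsed <= bound_s

# A trivial handler standing in for any interactive response.
result, on_time = respond_within(0.1, "ping", lambda msg: msg.upper())
print(result, on_time)
```

Note that nothing in the sketch makes the response fast; it only makes the delay measurable against a stipulated bound, which is the whole of the definition.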
The drives invested in real time certainly lead one way to a fantasy of absolute control, in the prosthetic slave of the computer, a laborer requiring no reproduction, responding to instructions without delay and working around the clock. But in its captivation of the scopophilic surveillant user, real time simulation also taps a more conflicted, profoundly ethnological desire to penetrate the field of “field work” without incurring the risk of being converted: the risk of never returning to the disciplinary home. In its simultaneous aggrandizement of distance and proximity, the double bind of real time simulation is nowhere more piquantly radicalized than in climate modeling, a computational science of imagined time scales requiring massive inputs of computing power.4 The material economy of such work in supercomputing offers an encounter with the proleptic modeling of potentially catastrophic change as an activity with potential impacts on the conditions of study itself—or as a bound on the reproduction of conditions of study, as we know them today. The real scale of these operations, on any properly dissected contemporary research campus, is more than safely concealed [End Page 182] from humanists who might complain that the literary imagination itself can certainly tell us what to expect from such expenditure, if not to anticipate its local colors. And yet the everyday media of what is often called “Web 2.0” offer convenient access to downscaled data visualization applications such as the Melbourne-based digital artist David Bleja’s captivating Breathing Earth, as paralyzing an armchair education in energy globalization as one could want. 
Understood as a program, first of all—a series of written instructions—Breathing Earth is truly a work of world literature in the most generous sense, in so far as to prod and skew it with one’s screen pointer is to be driven to reflect on the sustainability of practices of modeling time, in themselves, even (or especially) when we model our own choking to death.5
It is a question not only of the sustainability of particular material practices, relative to and as alternatives to others, but of the sustainability of a whole way of life and a particular civilization, as it is now administered. In temporizing the peripheralization of the Mediterranean in an early modern reorientation to the European Atlantic, Fernand Braudel had hoped “to illumine our own century…in that ‘utility’ in the strict sense which Nietzsche demanded of all history,” permitting reflection on “the painful problems of our times” (1995, 19). Such was also the motive of Janet L. Abu-Lughod, who suggested that the future of Western sociological self-understanding, as much as of its object of study, rests with our proleptic powers of imagination of systemic decay. In the half-light of a United States inheritance of Euro-Atlantic power understood as incipient waning after the triumph of 1945, this object of study is a spectral cultural afterimage of the Western empires and, as such, an inverted mark of the world preceding a world Europe, a world containing “no single hegemonic power.” But of such an end to the story of European hegemony as the end of the story, as Abu-Lughod wisely observes, our structures of feeling and institutionality alike have little to say, even today:
[A] theory of systemic change should be able to account for system decay as well as system growth. This, however, is not as easy as it sounds. In tracing the development, expansion, and greater connectivity of a system, there is a natural tendency to concentrate selectively on those things that increase “systemness.” No such natural principle of selection, however, is available to scholars trying to analyze the decline of a system. It is thus easier to account for positive changes than for negative ones.(1989, 4) [End Page 183]
The possibility of, indeed, the mandate for, speculation on “the impermanence of all systems” (370) stands for the non-memorable, unthinkable forgetting of modernity—indeed, perhaps for a thinking of modernity as the fabricated object of an inter- and anti-institutional critical modernism as something that never in fact really settled itself. Far from an apocalyptic vision, in the structural panic of a civilization faced with the unbelievable disenchantment of its belief, such forgetting perhaps need be taken only, and only so far, as a lesson in routine stasis.
We might then think our critical climate today as a contested site of critical modernity, precisely where its question might most readily be assumed to be settled: in the brute historical fact of the computer and the history-ending historicity of brute computation, in the newly digital humanist’s working office. Saree Makdisi, for example, suggests that Blake’s legacy asks us to consider “imperial investment in human energy” and how such profoundly consumptive investment structures history itself and its periodization in scholarship (2011, 320), while Imre Szeman notes that rather than challenging the “fiction of surplus,” literature bolsters it like any other presentist social narrative. “Ever more narrative, ever more signification, ever more grasping after social meaning,” Szeman reminds us: “what literature shares with the Enlightenment and capitalism is the implicit longing for the plus beyond what is.” Literary studies as we know it, Szeman suggests, was built over a “foundational gap to which we have hitherto given little thought”: the “epistemic inability or unwillingness to name our energy ontologies,” so that “we can continue to be who we are now” (2011, 324-325).6
To the humanities, at least, new media seem to offer the broadest form of salvation, in the grant-fundable laborative scientism that might usher us out of the labyrinth at last, into active administration of the means of knowledge production. The precipitous hope invested, yet again, in even those genuinely insightful and deserving recent productions of a nascent post-humanist literary-critical episteme, tells us something of the absolute desperation of an encircled humanism, unwilling even to wager that an administrative technomodernity now suicidally doubling down on every one of its fruitless mandates could ever provoke a civic and civilizational break to the other side. If the world-economic troubles of 2008, as a hemispheric-civilizational crisis of confidence, are to mean anything to critical practice, [End Page 184] they will have to provoke an account of the encroachment of systemic decline and collapse on the critical episteme, in the universality of the United States-modeled university as complexly exploded and imploding agent of academic modernity. The time has come, we might say, for new critical anachronies addressed, from the technocratic horizon of the imminent (and immanent) death of humanist print, to the still more remote and both “human” and post-human extinction of the digital, itself.
The Stationary Age
To ask, in all honest curiosity, what the extinction of the digital would “look like” is of course to demand a picture of the unimaginable. Naturally, I do not mean something like molecular computing as applied science, further advancing the saturation of the lifeworld by a mechanistic ontology ever so strenuously qualified and disavowed by its operators as such. Rather, I mean something like “history’s disquiet,” as Harry Harootunian so elegantly imagined it, in time’s revenge on even the study of time, in itself, as much as on study’s many and diverse spatializing specializations. Drawing on both Fernando Pessoa and Tosaka Jun, Harootunian traces the traces of a catastrophic actuality of everyday life reducing all our “modernological” labors to naught. This haunting or spectralization of the Euro-Atlantic modern present by everydayness, Harootunian observes, “holds its evidence and its force of actuality in its capacity to actualize the promises not kept in the present and to reveal the possibilities for critique and renewal that its negativity has concealed” (2000, 21).7 It marks the non-evitable loss of the scholarly imperium of research through which that modernity maintained and maintains itself, today, as academic capitalism (Harootunian 2004, 397) facing off against Pessoa’s literary heteronyms’ “recognition that nothing lasts” (Harootunian 2000, 2).
Nothing. Not the reduction of human conflict to the soluble dimensions of technical problems, as Joseph Weizenbaum saw it, warning us that “there are some acts of thought that ought to be attempted only by humans” (1976, 13). Not the remodeling of critical inquiry, itself, as measured output, in the pervasive culturalization of computation that now almost entirely [End Page 185] circumscribes contemporary academic intellectual life.8 I am saying that we may as well look ahead, since there is no pleasing our masters, who will make every new level of productivity achieved a standard, until we have collectively gone mad—or are dead9—and for whom the freedoms we imagine for digital scholarship are in the end perhaps merely productive efficiencies.
There is no reason to believe that any new critical turn will prove any less resource-intensive than what preceded it, or that an “entirely” digital publishing infrastructure will consume any less energy than its mechanical or mixed mechanical and digital antecedents.10 Nor is there any reason to believe that the tendency of modern research to what Heidegger called “the industrious activity of mere busyness” (Betriebsamkeit des bloßen Betriebs) (2002, 74; 1957, 97), and Innis “the expenditure of subsidies for the multiplication of facts” (2008, 203), will be any more sustainable in digital media than it was in print—either ecologically or as a cultural assertion of civilizational modernity as fait accompli. To address the ecological impasse we now face is not to demand some productive new critical-theoretical innovation, perhaps, so much as some restraint of mechanized critical and critical-theoretical production, in itself—truly a re-evaluation of ourselves as we are accustomed to work. Raised to the next level, then, the question of sustainability, for the paper-print culture of our work as much as its ecology of arguments and ideas, is the question of the sustainability of the current level of scholarly production, itself, and of the productivism that is its principal driver, irrespective of medium, in “real time” research producing uninterruptedly published results. To the extent that already, in the highly leveraged current system, we scholars are collectively constrained to write much more than we can possibly read, we might well ask ourselves whether what Mike Davis called “the bubble world of American consumerism, as it existed…in 2007,” can ever be restored—or whether “protracted stagnation, not timely technology-led recovery, seems the most realistic scenario” for the years to come (Davis 2009, 40).
Echoing Davis, Gopal Balakrishnan speculates that the historical vitality of capitalism “has depended on a demographic youthfulness…unsustainable over the long term” (2009, 20), with the ecological impasse of the historical present “likely to be the most absolute of all” (21). How, then, might the “stationary state” of secular stagnation to which Davis and Balakrishnan point, as “our” future, bear on its reflection in the material conditions of production of our knowledge of that state, itself? The consequences for the default modality of academic critical modernism, today, are clear: if the “stationary state” is symbologically not the old world of paper, progressively superseded by the new digital media and their administration, but a static [End Page 186] world in which new media cannot save capitalism any more than China, India, Brazil, or Turkey can, then when we speak of the sustainability of our critical climate, we perhaps mean not sustained forward momentum, in growth as constant and violent progress, but little more than keeping things from getting any worse.
What would it mean for the new digital humanists to integrate an account of the eco-systemic impact of computing into their work? When in 2009 Alexander Wissner-Gross, a Harvard-trained physicist running a startup measuring Web sites’ carbon footprints, opined in The Times (UK) that “each Google search generates an estimated 5-10 grams of carbon dioxide,” he found himself swarmed by Google’s vast immune system, producing an Internet news “meme” that eventually branched into publicity for Google’s, Yahoo’s, Amazon’s, and Microsoft’s plans to fuel green server farms with wind turbines and methane from cow manure harvested from dairy farms.11 It is no longer controversial to point to the administration and monetization of conservation itself, in green industry, as an either mindless or entirely strategic check on ecological activists’ goals. Of what he argued was the direct dependence of electronic textuality on strip-mined coal, Wendell Berry has observed, with his characteristically and usefully mordant purism, “I do not see how anyone can…plug in any appliance with a clear conscience” (1990, 177). Facing the tendentious and relentless dialecticalization of the temporization of waiting before the Next, we might say that what we need, now, is a truly regulative ideal, in the non-concept of zero: a cognitive oblivion or knowledge-death as terminus of scholarly hyperspeed, like the ideal of corporeal death that scares us off fast food.
Long ago, Walter Ong wrote of the enormous intellectual strain it takes for a modern literate to imagine a culture in which words have “no visual presence, even when the objects they represent are visual” (2007, 31). More recently, Catharine Diehl (2008) has traced the substitution of a signifying nothing, or positive zero, for an older and naïver historical positivism, in the structuralist moment, and Lydia Liu (2010) has argued the enclosure of structuralism’s “post” in the high-theoretic cybernetic unconscious of the postwar world order of Wiener and Bush. In fact, that non-positivizable zero is always already pursuing us, today, as we write in and on line, and it is hardly “theory” (poststructuralist or otherwise) that drives its archive fever. The non-signifying nothing of a digital finitude or media “degree zero” is the [End Page 187] temporal terminus of the archival death of our files, in software and bit rot, analogous to “language death” in its total removal of a local and temporal episteme from being-in-circulation:12
Except for fields with a tradition of grey literature and preprints—such as physics and economics—there are neither widely shared perceptions of long-term value nor well-established practices for the selection of [new genres of scholarly discourse] for long-term retention…outside of those disciplines (chiefly in natural and social sciences) that have had longstanding traditions of preprints and grey literature, most new forms of scholarship go uncollected and unpreserved.
Even longer ago, Ted Nelson imagined a file structure capable of receiving such change, in accommodation to the erratic but incessant “disappearance and up-ending of categories and subjects” that is the dynamic of knowledge itself.13 And if, for all their allusive literariness, Nelson’s dream machines are perhaps ultimately (and fatally) industrious diagrams of time,14 the radical negation that Nelson was willing to face, in his work, productively haunted it, just because Nelson refused to turn it away. If there is nothing to prevent digital publication from licensing a still more tremendous avalanche of mediocre research in additive and accumulative supplementation, rather than considered revision of older forms, there is really nothing so soundly standing in the way of its inadvertent or contrived deletion, either. If the opportunistic critique of “useless” humanities research now emerging from the academic center-right15 has but one virtue, it is to focus attention from the inside on the productive automatism of mere busyness, which Heidegger told us “must, at all times, be resisted,” if we wish to save modern research from itself:
Constant activity becomes mere busyness when its methodology no longer holds itself open on the basis of an ever new completion of its projection, but rather leaves this behind as something simply given [End Page 188] and no longer ever requiring confirmation; instead, all it does is to chase after results piling on top of each other and their calculation. Mere busyness must, at all times, be resisted—precisely because, in its essence, research is constant activity….[T]he more completely research becomes constant activity and in this way becomes fruitful, the more steadily there grows within it the danger of becoming mere busyness. In the end we reach a situation where the difference between constant activity and busyness [Betrieb und Betriebsamkeit] is not only unrecognizable, but has become unreal.
It is not in fact limitless opportunity, but only a radical circumscription, in the creation of limited, rule-governed model worlds, that enables and sustains such mere busyness. Heidegger noted the integral role of the publishing industry in disseminating, mimicking, and in its own way driving this “worlding” conversion of problems to results, through limited publication: narrowing the publishable world through forward projection as the construction of change.16 And it is thanks to that ongoing construction of gratuitous, gratuitously incessant, and incessantly violent change, in capitalism’s circumscribing “creative destruction,” that the permanence of the progressive civilizational legacy of Euro-Atlantic modernity is a question, rather than an answer, for more core subjects of the United States empire, today, than at any time since the 1970s.
Viewed in this light, the salvation of the humanities, in what remains of the United States public eye, at this historical moment, might well lie not in still more strenuously friendly, unpersuasive, and unreciprocated approaches to technoscience, but rather in setting a moral example by purposefully refusing to grow. What will it take, I am asking, to recognize time, not print or digital media, as the medium of research—in so far as time itself brings all worldly striving to extinction?
I insist that the structural irony of voicing this question, as someone personally and structurally committed to life in and the life of institutions, is not simply a contradiction. With masters that cannot be pleased, and little left to lose, now, we might as well insist on the long-durational productivity of waiting for our work. But this need not entail what the historian Arthur Herman, in The Idea of Decline in Western History, superciliously names “cultural pessimism” (1997, 7-10). I have suggested elsewhere that institutionally assertive primary cultural productivity, in the indisciplinary practice of the arts in the university, offers one form of relief from the profoundly needy productivism of research as mere busyness.17 One model for such professional [End Page 189] literary and cultural-critical temporization, in the “new stationary states” to come, might be found in a now widely proposed, if nowhere enacted “revaluation of the essay” and of a certain essayism18 against a fetishization of rigor as bulk, rather than depth of thought. But let me close with a vision of another kind of productive activity that perhaps looks (but only looks), in the institutional optic, something like “doing nothing.”
For McLuhan, a medium was very much like a language: a fully rounded and local “world,” at once limiting, determining, immersive, and profoundly educative in its partial or complete incommensurability with many diachronically historical and synchronically contemporaneous others. How might it have turned out for McLuhan, one has to wonder, if rather than demanding the most that the one medium he mined could yield him, he had dedicated himself to something like adult second language acquisition?19 I mean language acquisition as a practice, the everyday practice not of a student, but of the fully trained and mature intellectual who rightly believes he will always have something better to do—and not as a means to an end, in immediately applied professional translation, but rather (or at least first) as a productive time out from critical productivity, yielding results only in the personal longue durée.20 In the lived time it takes to acquire,21 a second language stands not, or not only, as instrumental remedy for the literal or critical national-cultural monolingualism endemic, just for example, to “new media studies” itself, as a research field. For its acquisition is also, inevasibly and instructively, a figure for the antinomy or structural contradiction of a primary experience of (and of being-in-) mediation. [End Page 190]
Brian Lennon is Associate Professor of English and Comparative Literature at Pennsylvania State University, USA. He is the author of In Babel’s Shadow: Multilingual Literatures, Monolingual States (2010).
1. See Bush (1945). The essay was republished in Life following the bombings of Hiroshima and Nagasaki. See also Bush (1967, 75): “We are being buried in our own product.”
4. Defining climate as “the set of statistical properties that emerges at large temporal and spatial scales from well-known physical principles acting at much smaller scales,” Michael Tobis, Chad Schafer, Ian Foster, Robert Jacob, and John Anderson note that “successfully representing such large-scale climate phenomena while specifying only small-scale physics is an extremely computationally demanding endeavor,” going on to describe a climate model design exploiting parallel computing to achieve simulation “over 6000 times faster than real time with good fidelity.” See Tobis et al. (1997, 1-2).
5. See Bleja (2010). In a space beneath the visual console, Bleja’s notes for Breathing Earth helpfully suggest that “there are plenty of things that we can do to reduce our carbon footprint. The key word is reduce. We can greatly lessen our impact on climate change by using the planet’s resources more responsibly. There are many things we can reduce, and many ways we can reduce them, but three of the major ones are: reduce the amount of animal products you consume…reduce the amount of fuel you use…and reduce the amount of electricity you use. There are plenty of good resources on the web.”
6. See also McLemee (2011), who concludes: “I am writing this article, and you are reading it, on streams of electricity. Most of us think about electricity only when the circuit goes dead. The rest of the time, it is an invisible necessity—increasingly presupposed by literary culture itself, at least in what is sometimes called the world’s ‘overdeveloped’ economies. And in a way, this may be the next step for the critical project of analyzing literature’s ‘energy unconscious’: thinking about what happens to reading when the written word itself depends on raw power.” To this, one might add what McLemee leaves mostly to implication: that one must consider the “energy unconscious” of professional literary criticism and scholarship itself, as much as that of its literary objects: a consideration bearing on some fundamental and unresolved questions about how (and perhaps more importantly, why) criticism and scholarship are produced, disseminated, and preserved.
7. Harootunian’s preferred term is “Euro-American.” I use the phrase “Euro-Atlantic” to mark two separate, if interdependent, ideological imaginaries and disciplinary acts: first, the Atlantic eclipse of the Mediterranean world in the age of European global colonization, as described in the early work of Fernand Braudel; second, the absorption, by a semiconservative U.S. American studies, of the figure of motion Paul Gilroy named “the black Atlantic.” Braudel emphasizes the geo-historical and thus temporal, as much as geo-spatial “turn” of the sixteenth century; against the spatial continental figure of the “Euro-American,” meanwhile, the “Euro-Atlantic” arguably registers something of the active capture of the indisciplinary movement and civilizational consequence (in modernity understood as slavery) of the Middle Passage. See Braudel (1995) and Gilroy (1993).
9. I was entirely serious, earlier, in invoking the personal consequences for McLuhan of discarding Auden’s injunction (in “Under Which Lyre”) “Thou shalt not worship projects.”
10. As long as writing, publishing, and reading tools still comprise manufactured materials, of course, such an infrastructure cannot be “entirely” (or even mostly) digital.
11. See Wissner-Gross (2009), Leake and Woods (2009) (with a clarification added January 16, 2009), “Powering a Google Search” (2009), Kincaid (2009), Dowell (2010), Vance (2010), and Gupta (2010). Kincaid writes: “An editorial piece Wissner-Gross wrote to accompany the widely-spread ‘Tea kettle’ article contains the passage ‘based on publicly available information, we have calculated that each Google search generates an estimated 5-10 g of CO2,’ which seems to indicate that the statistic came from his research. However, Wissner-Gross denies that he offered the ‘5-10 g’ figure as his own. In the draft he submitted to the Times, he referred only to ‘publicly available information,’ not to his calculations. [Jonathan] Leake has confirmed that the wording was changed during editing, but insists that Wissner-Gross claimed the statistic as one of his own findings during a phone conversation.” See also Glanz (2011).
13. See Nelson (1965, 96-97): “The physical universe is not all that decays. So do abstractions and categories. Human ideas, science, scholarship and language are constantly collapsing and unfolding….While the disappearance and up-ending of categories and subjects may be erratic, it never stops; and the meaning of this for information retrieval should be clear. Last week’s categories, perhaps last night’s field, may be gone today. To the extent that information retrieval is concerned with seeking true or ideal or permanent codes and categories—and even the most sophisticated ‘role indicator’ syntaxes are a form of this endeavor—to this extent, information retrieval seems to me to be fundamentally mistaken. The categories are chimerical (or temporal) and our categorization systems must evolve as they do” (emphasis in original).
14. See Nelson (1965, 97): “Information systems must have built in the capacity to accept the new categorization systems as they evolve from, or outside, the framework of the old. Not just the new material, but the capacity for new arrangements and indefinite rearrangements of the old, must be possible” (emphasis in original).
16. See Heidegger (2002, 74; 1957, 98): “The growing importance of the publishing business is not merely based on the fact that the publishers (through, for example, the book trade) have a better eye for the needs of the public, or that they understand business better than do authors. Rather, their distinctive work takes the form of a process of planning and organizing aimed, through the planned and limited publication of books and periodicals, at bringing the world into the picture the public has of it and securing it there.”
18. See Damrosch (1995), Said (1983), “The Extension of the Monograph Requirement; Rethinking the Preeminence of the Monograph; Revaluing the Essay,” in MLA Task Force (2006, 38-39), and Sidonie Smith (2010).
19. I mean “second language acquisition” in the conventional sense: acquiring a language other than one’s own first or “native” language or languages. “Second” refers here to temporal order, not number: a third or fourth language can be a “second” language in this sense, in relation either to a single first language, or one or more first languages.
20. This distinction lies at the heart of my In Babel’s Shadow: Multilingual Literatures, Monolingual States; see Lennon (2010). See also Perloff (2006). See also Spivak (2010, 36): “the task of the translator…might be to rethink the current workaday definition of translation and try to make translation the beginning, on the way to language learning, rather than the end” (Spivak counterposes this to the activities of “impresarios of a multicultural circus in English”).