
CHAPTER 3

An ABC of Post-Theoretical Academic Style

AUDIENCE

Postmodern architecture, Charles Jencks (1984) tells us, is “double-voiced.” It addresses two audiences at once: the public who uses and sees the building, and the professional cadre of architects. For many (although not all) academics, the work would not seem worth doing if not for the (admittedly strained) pretense of its potential impact beyond the university and the national professional organizations. The nonacademic audience is a necessary fiction, a shaping presence upon the work.

Broadly speaking, social scientists imagine their work reaching policy wonks, whereas humanities professors dream of the general reading public. Only scientists get to be single-voiced; they write for the fifty or so other people in the world who are working on and could possibly understand their corner of the universe. (Scientists have their own necessary fictions, but they dream of impacts that don’t depend specifically on audience.)

The styles of different academic work are shaped in relation to the imagined audience. The topics, the modes of argumentation, the authorities cited or argued against, and the amount and kinds of documentation are mandated by the protocols of the discipline or profession. But the discursive style—telegraphic almost to the point of bulleted speaking points at one extreme, belles-lettres at the other—indicates the non-academic reader that the writer dreams of swaying.

Theory—taken in the all-encompassing sense of a quasi-philosophical discourse that questions the assumptions of the various disciplines—makes the problem of audience more acute. Theory is more abstract, more abstruse, and more self-reflexive, hence more “academic” in every sense of the term. (By “post-theoretical” I mean work undertaken in an academic landscape permanently altered by theory, not some presumed “death of theory” followed by a restoration of the pre-theoretical universe. And my observations here pertain only to the “American scene.”) Yet theory usually comes with a social and political agenda born of a suspicion of the academy and the way it organizes knowledge in disciplines and aids power through the institutional regulation of knowledge. So theory hardly relinquishes the hope of having extra-academic effects. And theory, insofar as it is successful, unsettles the unanimity of the professional audience, while also inviting academics outside any one discipline to eavesdrop on and even criticize the work within that discipline. Academic work is potentially addressed to even more audiences now, and those audiences are less likely to be homogeneous within themselves.

BOURDIEU

Academics take pride in the work they do. They believe it is necessary work and that its necessity justifies the honors and salaries attached to it. But they also know that ambition drives much of their efforts and that professional advancement rather than an impact on the world is the most likely fruit of any particular piece of work. How to avoid cynicism under such circumstances?

Teaching helps. Most academic writers regularly face non-academic audiences in the classroom, and much academic prose is shaped by pedagogic purposes. Making one’s knowledge and expertise available to the noninitiated takes some of the sting out of participation in the kind of professional maneuvering that harps on ownership of ideas, exclusivity of innovative work, and competitive assessments of others’ work.

Some kind of belief in progress is also necessary. In one way or another, the academic thinks that the production and acquisition of knowledge is connected to making the world better. Just as the teacher aims to leave her students better off at the end of the term, academic writers want to better the world, not harm it. Even where they think their chances of a positive contribution are nil (because from their marginal position they have no lever with which to move history), they usually believe their efforts, while pathetic, are harmless.

Where the belief in the work’s salutary effects joins the pedagogic impulse in essays addressed to the professional audience, we get the weird effects characteristic of a contemporary academic style that is also cognizant of the rising status of the professional middle class at the expense of the non-professional middle class. The stakes in professionalization are so much higher now because the economic and social gap between professionals and non-professionals is wider. While the long economic boom has made entrance into certain professions easier, there has emerged a more clearly marked hierarchical system in higher education. Graduating from a “good” school is more important than ever before. The job market for academics is generally much tighter than that in almost all the other professions, although conditions vary from one academic specialty to another. And despite the fact that almost every school now requires some professional publication by its faculty, the gap between the working conditions, salary, and privileges of faculty at “research universities” and their colleagues at lower-rung state universities and community colleges has never been wider. That gap is also institutionalized at many of the research universities themselves, where untenured and nontenurable adjuncts teach the lower level courses for miserable pay while the “research faculty” writes its essays and books. That written work becomes more professional (more specialized, more technical, more oriented to disciplinary disputes, more footnoted) corresponds to this greater differentiation of professionals. Yet the same work, when done by academics who are teachers, often implies an ethos of providing the author’s knowledge to all comers and often makes explicit claims about the social good the author’s knowledge or arguments could serve. Increased professionalization, in other words, is accompanied by more fulsome claims about the wide-ranging benefits of professional work. The writer often appears to be trying to convince himself.

I suppose innocence about such matters prevails in some quarters. Not every academic has read Bourdieu and thus has not had to face the challenge of cynicism directly. But in America the academic professions have offered such a standard way to escape the middle middle class for its upper reaches (with the resultant—and inevitable—alienation from one’s family and origins), that the split between the professional and the nonprofessional is a lived reality for many academics. Top-echelon research professors do not have to have read Bourdieu to understand that their jobs and privileges look like a scam to the folks back home—and to the adjuncts and graduate students on their own campuses. And so their work, even at its most academic, also addresses that skeptical audience (lodged within as well as back home) to affirm the benefits and nobility of the pursuit of knowledge.

CRITIQUE

The folks back home are also troubled by how “negative” academics are. Academics are so critical of everything; nothing ever meets their standards. Marx’s call for “a relentless critique of everything existing” might be written over the lintel of the modern university. Colleges as the repositories of received knowledge, with professors as the custodians of the tradition, have yielded to the multiversity, with its emphasis on the production of new knowledge (the sciences and the quantitative half of the social sciences) or on the critique of existing knowledge (the qualitative social sciences and the humanities).

Critique, of course, has tried to undermine the faith in progress attached to the production of new knowledge. The relentless critique of everything has now reached to the efficacy of critique itself. Another form of cynicism lurks here. Does critical reflection, lucidity about the social and intellectual processes by which habits and values are formed, gain us anything?

The watchword of critique has always been that the truth will set you free. And that faith has proven marvelously resistant to attack over the past thirty years. Much work in the humanities and social sciences still follows the path of demystification, revealing the true motives, actual causes, hidden structures, and processes behind appearances that are consistently misunderstood by the majority of social actors. If only we can replace such mistaken beliefs (that gender is natural, that our hierarchies reflect merit) with recognitions of the true state of affairs, we will be better off.

The arrogance of this position is among the least of the reasons that it has come under increasing attack. More prominent have been worries about truth claims. Why believe reflection superior to first impressions? What could stand as independent criteria for reflection’s being correct? Humans lie to themselves all the time. What exempts critique’s reflections from being “rationalizations” (in Freud’s sense of that term)? The “critical distance” and “cool, hard-headed” style for which critique congratulates itself runs athwart recent arguments about the situated character of all thought and the complete interpenetration of thought and emotion.

Furthermore, the translation of knowledge and understanding into action is problematic. What moves someone to act? I know that funding for public schools is grossly unequal across the various school districts in my state, and I believe this is wrong. What am I doing in relation to this knowledge and this belief? Not as much as I am doing to write this essay. There are many options for action in relation to our multiple convictions, and there is no simple path from conviction to action. We can know that capitalism is corrupt and exploitative, believe (and the temptation here is to add an emphatic like “strongly” or “truly”) that capitalism’s practices are wrong, and yet not change our behavior very much from the ways we acted prior to acquiring that knowledge and those convictions.

Such considerations make the rhetorical as opposed to the informational component of critical writing more salient. If “engaged” academic writing has become more “performative,” it is because critique has come to the point where it questions its own simplistic faith in the power of knowledge. Of course, rhetoric and knowledge are not opposites. Facts, arguments, anecdotes (stories true or false), examples, and moral principles are always presented in a particularistic style in relation to a projected audience. But contemporary work is much less likely to believe that “the facts will speak for themselves.” Even if the writer has no grander design than to convince the readers of an academic journal to publish his or her essay, the contestation within theory of just about every possible position means that assertions must be carefully constructed and buttressed.

Where the aim is grander, we might say that “critique” of the Frankfurt School variety has yielded to the kind of hegemonic work described by Laclau and Mouffe (1985) or Stuart Hall (1988). It is not enough to lay bare the bones of our social reality. The writer must also, like Mr. Venus in Our Mutual Friend, articulate the bones. The writer must create the skeletal frame out of the scattered facts and values lying to hand in the current moment and strive through words to breathe life into that body.

DIFFICULTY

What happens when you cross the godfather with a poststructuralist? You get an offer you can’t understand.

Theory is difficult. That statement is almost always a complaint. Most pointedly, the difficulty of theory appears in direct contradiction to its ostensible political goals. If these theorists want to have an impact on society, it seems absurd to write essays and books that less than one percent of the population can read. Manifestos with footnotes capture the laughable plight of today’s would-be radical intellectual, a careerist in the university who believes himself a threat to the status quo. Luckily, he has Roger Kimball to bolster his self-esteem.

The situation is more complex, more difficult, than the common complaint allows. Note that what might be called the literacy gap parallels the economic gap that has opened up between the professional-managerial class and the rest since 1965. Graduate students must master much more difficult material and write in a more difficult style to get jobs teaching students who will be less literate than the students of forty years ago. While the university (especially in graduate studies) has become hyperliterate, the school system (from community colleges on down) has become less literate.

But let’s not get sidetracked by nostalgia. Fifteen percent of the population reads eighty-five percent of the books that get read in the United States. There is no reason to believe that these numbers have changed significantly over the years. The vast majority of adults read very little at all, so it is ingenuous to blame an academic for reaching less than one percent of the population when the best-selling novelist reaches three percent. Any argument about the impact of books must take a trickle-down form because no book can compete with the mass media for direct contact with audiences. Books, for better or worse, are directed to a small minority, and what no one wants to face (in the democratic context of the United States) is that we cannot assess books’ impact without considering the power of elites to influence society out of all proportion to their numbers.

The real issue of difficulty, then, is what audience among book readers the author hopes to reach. Difficulty is rewarded if one seeks acclaim as original or as being a major thinker. (Of course, just being difficult insures neither of these rewards.) But difficulty can jar with populist aspirations, the desire to imagine (or even achieve) contact with an audience beyond the academy. Richard Rorty offers an instructive example here. He quite deliberately abandoned the technical style of his early work on the philosophy of mind for the breezy, synthetic, and dramatically binarized (public vs. private, solidarity vs. irony) style of the later work. As a result, he became the philosopher most likely to be read by non-philosophers and occupies the public place once held by John Dewey and Bertrand Russell, a place that had been vacant for many years because no philosopher chose to break with discipline-specific notions of rigor, care, and difficulty.

The word “chose” in the previous sentence makes me nervous. Rorty obviously did make some choices. But style is not infinitely malleable, not a simple matter of choice. The difficulty of writing stems partly from this resistance—of what to what? To say the words resist one’s ideas doesn’t seem quite right. In any case, it is difficult to express what one wants to say in writing. Writing is an endlessly frustrating—and perhaps for that reason endlessly fascinating—enterprise.

This brings me to my last two thoughts on difficulty. Pre-existing models and forms create worn grooves that can make writing less difficult—both for the writer and the audience. Formal experimentation is, of course, a hallmark of modernism in the arts. Where the thought aims to be new, a new form must be invented for its expression. Theory, in general, partakes of this commitment to novelty and the concomitant attraction to new forms. So it sets itself as well as its reader a difficult task.

But difficulty also seems an outgrowth of temperamental imperatives. A writer is driven to worry a point and the reader is, finally, willing or not to follow the writer down that path. Proust manages to get me (but hardly every reader) enchanted by his two-hundred-page consideration of the difficulties of getting to sleep. I find Stanley Cavell’s idiosyncratic self-indulgences charming, because his explorations of his obsessions strike me as productive. But my love of Derrida as a close reader is tested by my impatience with his starting so many essays with a meditation on the difficulties of getting started. In my own writing, I find the question of where to cut short, of where to stop complicating the issue, continually troubling. How much of my audience am I losing at each new turn of the screw?

EXEMPLARY

The divide between quantitative and qualitative work is as deep as the divide between the “hard” sciences and all the other disciplines. Qualitative work is burdened and blessed by the perils of exemplification.

The example is synecdochic; it is the part that represents the whole. The problems raised by claims to typicality are endless—and thus serve to generate lots of revisionist work for academics. Victorian culture looks like one thing if my sample comes from its novels and poems, quite another if I take newspaper articles and court cases as exemplary. More intriguing than the endless disputes about what the Victorians believed tout court is the question of whether such holistic claims must underlie our attention to and interpretation of particulars. It is the rare academic work that does not justify its attention to particulars with some gesture toward what that attention reveals about the wider social reality (as if it is self-evident that the audience is more interested in and the work more justified by claims about Victorian culture than the story of Tennyson’s invalid wife). The handling of examples, in other words, indicates where academics believe significance lies. What the example is taken to be an example of names the target of the work.

From the left (as it were), Derrida and others have questioned the ability of the example to do this work, while the quantoids (from the right) have always scoffed at using examples to make general claims. The example is both never enough and always too much. It is never enough to secure the general claim while, in all its wonderful detail, it always provides too much material. How does the investigator decide which details are exemplary, which supererogatory? The suspicion deepens that he finds what he came to seek, unless (happily?) he manages to let the example’s particularities distract him.

There is a hermeneutic circle here. The writer chooses the example(s) for his study guided by what he wishes to emphasize. Underneath or alongside the use of the example to make holistic claims is the commitment to certain holistic claims, a commitment connected to what the writer wants his work to do. The example, then, is moral insofar as it unites purposes both enacted (by the writer) and urged (by the writer upon himself and/or audience). So, for example, Mary Poovey (1988) takes Dickens’s David Copperfield as her example of the gendered professionalization of the literary author. Her choice is invested by her desire to tell a cautionary tale about what she claims is our culture’s dominant image of authorship, an image derived from historical developments during the Victorian era. Her own work strives to be an example of how to question that dominant image.

For the professional audience, the work is also an example of how to do work. Even in the quantitative and hard sciences, work is important not just in terms of what is studied or concluded, but also in terms of how the study is conducted. The illustration and implementation of methodologies is important—and is considered the “theoretical” part of the work in some disciplines. This attention to methodology is both messier and more central in qualitative work. With the questioning of all assumptions that has characterized theory, every work in the humanities exemplifies as well as argues for a certain way of doing work. The writer must position herself amidst competing models even as the method is modeled. Formal experimentation goes hand-in-hand with methodological exemplification when a consensus about the correct way to proceed does not exist.

FIELD

I want to write that in qualitative work there used to be disciplines, but now there are fields. Unfortunately, I do not believe that is true. Disciplines have proved awfully resilient. The inertia of academic institutions should never be underestimated. The structural force of institutional arrangements (especially departments and the organization of undergraduate majors and graduate degrees along departmental lines) carries much before it even after the intellectual rationale and/or unity of the disciplines has been lost.

A field is defined by whom one reads and to whom one addresses one’s work. We can take this in a Bakhtinian way: my field is shaped by myself and my interlocutors. It is not a question of whom I agree with, but of whom I must engage. The “must” indicates that these choices are primarily not personal ones. If I want to publish in a certain place, to become a participant in a certain field, I must engage the prevailing voices in the field as currently constituted. In my field of literary theory, for example, I cannot talk about examples without at least mentioning Derrida or about intellectuals without considering Bourdieu. But I can ignore Nelson Goodman on examples completely, and Shils and Gouldner on intellectuals.

The point is that many topics are shared by different fields. The difference resides not in the object studied, but in the constellation of positions about the object that the writer takes into account. Thus queer theorists have to consider psychoanalytic theories about human sexuality but can ignore with complete impunity biomedical work on the same topic. To some extent, biologists are beneath their notice; they have other people they want to engage, impress, convince. Fields rely on quick dismissals, founded almost entirely on ignorance. “There is nothing of value and interest being said by those people.” Contempt circulates promiscuously between and among academic fields.

Much of this is a life-saving strategy. Theory has brought the injunction that one should read everything. It looks with suspicion on the traditional academic disclaimer: “That’s not my field.” But theory has had to develop its own forms of dismissal, since there is simply too much out there. We all need some way to account for our choices, to explain our ignorances. But I am discouraged that fields—which seemed to offer such interdisciplinary promise—have often been as bad as the disciplines. I hate the close-minded disciplinary attack on “theory” as the misguided attempt of literary critics or historians to be “philosophers” or “sociologists,” as if all these enterprises cannot overlap or as if only strictly disciplinary training enables productive, instructive, and excellent work. Theory’s interdisciplinarity, its construction of a field across the boundaries of disciplinary homes, is one of its great strengths. But I am equally distressed when I see theorists dismiss Anglo-American philosophy or liberal political theory wholesale, reading such work (when they do at all) with the foreknowledge that their job is to disagree with every word they read.

Fields, then, are both smaller than disciplines (which often encompass several fields) and potentially larger (because attuned to work in numerous disciplines). The trick is to try to import a new voice or new perspective into the current constellation. Success is very dependent on authority—starting with the establishment of one’s own authority within the field (a laborious process). As Stanley Fish (1999) has argued, in most cases only an author possessed of such authority can succeed in introducing new material into the field. This suggests that innovation comes from established practitioners who first gained an audience by doing more conventional work.

Fields, like disciplines, are fluid, and are scenes for professional ambition. Fields even have their professional organizations and journals. But fields cut across the institutions structured according to disciplines and challenge the codifications of disciplinary training, disciplinary methodologies, and disciplinary right of access to particular object domains. The disciplines are far from dead, but they now have to co-exist (uneasily) with fields that refuse to simply function as sub-areas within the disciplines.

GRAND GESTURES

The most obvious impact of “theory” on American academic style has been the adoption of grand gestures in matters intellectual and political. American academics prior to 1965 generally followed the middle way advocated by Robert Merton, eschewing grand theorizing or all-encompassing projects. True, literary critics believed in epochal unities called Romanticism, Modernism, and the like, while also crediting (as did historians) notions of national identity. But they used terms like capitalism and imperialism rarely, and patriarchy, Western metaphysics, disciplinary society, and phallogocentrism were unknown.

There were, of course, engaged intellectuals, especially in New York City. But their style was rarely denunciatory on the grand scale, and even before 1950 the words “socialism” and “communism” were used tentatively in most cases. This was more a matter of decorum, of an intellectual antipathy to the unsubtle, than a matter of politics. Grand denunciations of Western civilization as rotten to the core did not enter American intellectual or academic life (these two overlap, but are not everywhere the same) until the Frankfurt School and French poststructuralism joined large-scale analyses with overt political commitment. The style caught on in America and is still around, although in decline. A certain diffidence, which was also a certain kind of irony, almost disappeared for twenty years (1970-1990), replaced by earnest declarations of political purposes and the ability of academic work to further those purposes.

For a skeptic like myself, who wants finer-grained analyses and a more tempered view of the connection between academic work and political effects, the decline of the grand gesture is still to be mourned. Ambitious, provocative, energetic, and idealistic work is always in short supply, even when (as is very rarely the case) academic fashion favors such work. And there is the suspicion that the current decline was largely the result of the constant sniping, inside the academy and out, about “political correctness.” While I, too, cringed at the excesses of much leftist work, I’d rather take my stand with the leftists—both politically and stylistically—than with their enemies. I feel answerable to (because I share many of) the political aspirations of the left, while leftists constitute one of the audiences I want to read and be influenced by my work.

The recent decline of the grand gesture, the return of academic modesty, has also influenced the shape of academic careers. The baby-boomers’ attraction to grand theorists made various academics born between 1925 and 1950 attain prominence within the academy in their forties. Many of these magisterial figures—Harold Bloom, J. Hillis Miller, Edward Said, Sandra M. Gilbert, and Fredric Jameson in literary studies—are still active, but very few critics born after 1950 (Judith Butler, Henry Louis Gates, Jr., bell hooks, and Eve Kosofsky Sedgwick are the prime exceptions) have written books that everyone in literary studies feels they must read. After a period when “theory” served as a lingua franca, literary criticism has fragmented again, and reputations have become more localized even in those cases where the ambitions are larger.

HERMENEUTICS

Hermeneutics, the art of interpretation, as contrasted to the gathering of fact (historical investigation) or the presentation of causal explanations, has irrevocably, it seems to me, upset the apple-cart of positivism. Even in the quantitative social sciences, “interpreting the data” now includes a heightened sensitivity to the categories that underlie statistical groupings, while the “hard” sciences will never again enjoy the unquestioned epistemological prestige they possessed prior to the advent of theory. Outside the quantitative fields, interpretation has won out over biography, editorial and other kinds of recovery work, and straightforward historical narrative; it is more prestigious and more universally required of students.

To the extent that the most prestige attaches to “theory,” we should recognize how much of theory is interpretation of large-scale social structures and patterns. Literature departments were the first to embrace theory because they were already much involved in interpretation. But the locus of interpretation shifted from the single text to that text as symptom or representative of larger cultural forces. Just as theory was refracted through the interpretive practices of close reading in literature departments, it is refracted through specific interpretive traditions in anthropology, sociology, history, and the like when it makes an impact in those fields.

The increased emphasis on interpretation coincides with the loss of methodological consensus. When there is a proliferation of ways to do the work in any particular discipline or field, facts are more obviously byproducts of interpretive strategies, and those strategies must be more self-consciously deployed amidst competing possibilities. If the arrival of theory is experienced as a fall, that’s because the serenity of self-evidence has now departed, presumably never to be regained.

IDENTITY

The perplexities of identity are endless. I’ll start with a minor puzzle: How did the poststructuralist focus on difference (Derrida’s différance) transmute into the obsession with identity?

There are two prominent images of modernity. The first is generally conservative and emphasizes how a rampant individualism has destroyed communal values, external checks on selfish behavior, and social order. Modernity is the chaos of all against all in the unbridled competition of a world shorn of all transcendent meaning. This vision can look radical when it inveighs against a godless capitalism. It invariably pits a nostalgia for community, imagined as a vague noncoercive fellowship with like-minded others, against experiences of anomie. The hallmark of this conservative vision is a willingness to trade in some individual freedom for order, authority, consensus, and fellowship. (That this vision co-exists with a nostalgic attachment to an individualistic entrepreneurial capitalism is just one of the internal contradictions of contemporary conservatism.)

The other account of modernity declares the notion of increasing individualism to be a myth. The modern world according to Weber, Adorno, and Foucault features increasingly powerful bureaucracies (both state and corporate) which manage individual lives down to the smallest detail. All modern societies tend toward totalitarianism, either of the overt type or of the more insidious forms that produce “mass society” through cultural institutions (schools, mass media, sports, the arts) and corporate capitalism. Centralized government, vast business enterprises, and the large-scale production of cultural meaning/value prevail. Differences are steam-rolled into sameness, or, worse, recalcitrant differences are ruthlessly eliminated. (In Foucault’s more diabolical version, differences are produced in order to make domination more effective.) Modern societies are less, not more, free than premodern ones. Under such circumstances, the Frankfurt School concluded that struggling to retain autonomous individuality (despite the fact that autonomy is the keystone of Kantian liberalism) is a radical response—and about the only one available in dark times.

An obsession with difference, then, coincides with the claim that modernity works to eliminate differences. But where is difference to be located? Poststructuralism, generally, does not locate difference at the level of the self. Derrida is not talking about the difference between one self and another. Mostly he seems to locate difference in units even smaller than selves. Individuals are fragmented, constituted of conflicting elements that threaten the very coherence of the idea of, the claim to, a self. Difference understood this way deconstructs identity.

But Derrida also, at times, locates difference in entities much larger than the self. He often writes as if “Western metaphysics” has an identity, while pointing toward the “Other” of metaphysics. The Other’s difference functions rather differently than différance.

On the one hand, the Other is located in cultural formations or traditions that are distinct from or resistant to Western metaphysics. Here poststructuralism links up with what might be called “culturalism,” defined as the poising of local, cultural differences against the universalizing juggernaut of modernization, which is resisted through loyalty to and defense of specific “ways of life.” One’s identity stems from culture, not from soulless, uprooted modernity. Thus “identity” (as a source of meanings and motivations) names what is at stake in joining battle with modernity.

On the other hand, it is very hard to document modernity’s crimes against the Other without, in the end, dealing with the individual. Walter Benn Michaels (1996) is interesting on this topic. How, he asks, do we characterize the crime of the Holocaust? Is it the attempted extermination of an entire culture or is it the murder of six million people? If we define genocide as the extermination of a culture, we seem to be valuing the culture over the lives, especially if we are aiming to say that murder is bad, but genocide is worse. Even if we do think there is a significant difference between genocide and mass murder, the fact remains that mass murder is a crucial means to genocide. My point is that harm to otherness is almost always going to be measured, at some point, in terms of harm done to individual bodies. Thus it is difficult to have a discourse of otherness that does not locate otherness in individuality at least some of the time (where individuality is understood as marked off by the physical separateness of one body from another). Identity is conferred by being the one subject to this harm, this injury. To the extent that poststructuralism focuses on the violence done to the other out of intolerance of difference, that other is going to be located, to some extent, at the level of the individual (the victim) who suffers that violence. And here poststructuralism comes very close (despite all its protests to the contrary) to espousing liberal notions of individual rights and pluralism.

The result has been a general confounding of poststructuralist thought with identity politics. Since the crudities and excesses of identity politics provide an easy target, both conservatives and anti-theory leftists have been quick to castigate theory for unleashing the multiculturalist hordes. To a large extent, the theoretical left has caved in to this attack. Judith Butler (1990), Wendy Brown (1995), and a slew of collections with titles like After Identity (Danielsen and Engle, 1995) and The Identity in Question (Rajchman, 1995) have marked high academic theory’s abandonment of its erstwhile allies in the trenches of identity politics. The identity folks were an embarrassment. When it came to a choice between political alliances across differences in sensibility that reached in some instances (especially in feminism) beyond the academy and respect from their academic peers, the theoretical left only hesitated briefly before choosing their peers.

I will not defend identity politics from incoherence. How assertions and celebrations of differences fit in with notions of determination by group membership remains a mystery. But the emotional force of appeals to identity and the ability of such appeals to move people to action is worth a longer look. Here, surprisingly, I think Derrida rather than Foucault is closer to the impulses behind identity politics than one might at first suspect. The difference (dare I say) is between religious and secular outlooks, between piety and impiety. Modernity, generally, is secular. It understands identity as the self-creation of the self through action. Identity is out in front of me, in the future, to be made. The past is not a determinant. I can transcend my origins as I pursue a career open to talents. Here we have the quintessential American myth of the self-made man with the concomitant American impiety toward origins (and parents) and indifference toward the past. This rootlessness, this resting of identity on achievement rather than ethnicity, religion, gender, or region is characteristic of the professional class, of those who have benefited from and feel at home in modern society. They have used their careers precisely to escape their pasts.

But there is a whole different emotional relation to identity, and it is no surprise (careerist that I am) that I don’t get it. Highly significant is the fact that this other sensibility now inhabits the professional world, instead of being checked at the door as the price of admission to it. Moreover, there is a generational divide here between those born before 1960 and those born after. For this later generation, what is primary is a piety toward roots, toward the places from which one comes, toward the social markers of identity. Identity is not out in front of me, but back there from whence I came, and my goal is to express that identity in a world hostile to it (because of its differences). Identity is to be defended against the world’s onslaughts, not abandoned and continually remade within the worldly structures of a career. Crucially, identity is not located within the self, not something the self makes and remakes as it proceeds. Rather, identity is lodged elsewhere, beyond and outside the self, and the issue is how to align the self with that elsewhere, to be true to its demands despite the world’s attempts to seduce me away. Derrida would use very different terms than my students, but pious references to that elsewhere, and to the responsibilities it enjoins upon the self, pervade his work. The primacy of the Other is what both gives the self an identity and the responsibility to protect it (as an otherness) against the stripping away of identity toward sameness characteristic of modernity.

JUSTICE

The old left focused on justice, especially economic justice. But the emphasis of the social movements of the 1950s through the 1980s was more on “rights” and “liberation.” Academic theory followed suit. Freedom, not justice, was the major concern. This shift reflected the fact that issues of liberation (respect for identity differences, the end to legal discrimination against various stigmatized groups, struggles to expand the franchise and citizen participation, resistance to state compulsion in non-economic matters such as the draft, abortion, and sexual practices) proved more powerful than economic issues in provoking popular political action. There was also a need to reject a moribund Marxist tradition, to invent a “new left.”

Now, in the 1990s, there has been an attempt to revive concepts of justice, especially in the environmental justice movement, but also in relation to welfare reform, and to the on-going widening of the gap between the haves and have-nots. Despite some gestures toward justice in theory circles (notably Derrida’s wonderful and frustrating essay “Force of Law” [1992]), matters of justice still remain under-discussed. Political concern in academic work still centers on a cultural politics of representation, resignification, and liberation from the limits of received thought. Such cultural politics often looks therapeutic, focused on exposing and combating social pathologies like sexism and racism as psychological rather than institutional matters.

It is no surprise that academics who locate the most effective point of intervention at the cultural level are prone to use psychoanalytic terms and theory.

I know, intellectually, that this is a time-worn and fruitless internal political squabble on the left. Do we most effectively promote change by reforming social institutions or by transforming people’s heads? The answer is that work on both fronts is necessary, and that nothing guarantees the effectiveness of either strategy. You do the work you can where you are, without knowing how or if it will make any difference in the short or the long run. “Pessimism of the intellect, optimism of the will.” And the experiences of the past fifty years, especially of the civil rights movement and its aftermath, do seem to indicate the crucial importance, if not priority, of cultural politics. Ending legal discrimination hardly ended racism; fundamental shifts in belief, attitude, the taken-for-granted, and the habitual appear necessary to any progress in race relations.

So why do I still find cultural politics suspect, suspicious both of its analyses of the problems and of its proposed solutions? Recognizing that I am probably being unfair, I still cannot help finding the characteristic discourses of cultural politics arrogant. The vast social majority is presented as benighted, unaware of how they actually think, how they process their experiences and make their decisions, unlike the enlightened writer, who holds the interpretive key to society’s unconscious. Moreover, I cannot help seeing the displacement of economic and political inequality by an ideological terrain of beliefs, values, and attitudes as serving to distract us from the relatively privileged position from which the authors of ideological critique always write. My response is shot through and through with intellectual and class ressentiment, and it is worth saying that Martha Nussbaum’s high-toned moralism elicits the same response in me as Slavoj Žižek’s bombastic psychologism. In part, I am a vulgar Marxist—and a vulgar liberal of the J. S. Mill and John Rawls variety—who insists that provision of the economic means for a good life (substantially beyond subsistence) is the sine qua non of a just society. Since our society hardly meets this standard, the first political duty is to point out that shortfall, and the second duty is to work to eliminate it. I understand that cultural politics pursues its indirect method because it believes that direct efforts have failed through the social psyche’s inability to apprehend the problem of unequal distribution. But 65 percent of me believes cultural politics is too subtle by half. The people on the bottom know they are being screwed and the people on top know they are screwing them. The resistance to change isn’t psychological, a matter of false consciousness or subject formation; it is simply the power of the powerful to maintain arrangements that suit them. No sooner do I write this, however, than my other 35 percent thinks of the convenient lies the powerful tell themselves (about effort, and merit, and opportunity) to get off the hook for the purposeful perpetuation of injustice, and of the “hidden injuries of class,” of the ways the poor believe that they deserve their fate.

KNOWLEDGE AND TRANSFORMATION

Are interpretations knowledge? What exactly is produced when we “read” a text or an event? The hankering after knowledge, defined as the delineation of fact, of truths about a mind-independent reality, is still strongly present after three hundred plus years of epistemological battering. Even though more and more post-theoretical writers are trying to kick the habit, prevailing practices run against anti-realist principles. Canons of evidence (quoting from the text; offering statistics; referring to historical events and dates) assume a world out there and discernible facts pertaining to it. A pure anti-realism is probably unattainable, so we are not going to bypass the epistemological woes attending claims to knowledge by simply declaring that we make no such claims. But we can try to decenter knowledge claims, shifting the emphasis from what our work tells us about the world that existed before we wrote to how our work acts to shape the world that will exist tomorrow.

Writers are engaged in a species of magic. Freud discovered in the “talking cure” that to name a past that had been unnamed (unremembered) enabled the patient to project a new future. It hardly mattered if this act of naming was accurate in any traditional sense. What matters is that the patient has taken charge of his or her own life, has assumed the ability and the right to name the past and thus to name and own the future. This naming will not acquire reality, will not actually create a future, unless it is endorsed by others. Efficacious magic is a social, not a solipsistic, act. Others, however, do not have to endorse the truth of my naming; they may even vehemently object that I have gotten it totally wrong. The important thing is that they recognize my action and respond to it. I have already done something in that case. My action is an action because it provokes a response, puts me into new relations with those who respond, as well as with those things I have newly named. If we re-imagine our academic work as transformative action upon and within the world, its status as knowledge becomes secondary. Or, we might say, its status as knowledge is more about the intersubjective relations of addressing others (i.e., rhetoric) than about the lineaments of reality.

Let’s, following Hannah Arendt, be fancy about it. This understanding of intellectual activity as the public enunciation of interpretive namings demotes epistemology (knowledge of the world) and promotes ontology (the creation of the world through dialogic interaction with others and objects).

KNOWLEDGE AND MONEY

While humanists pursue sweet dreams of creating the world through dialogic work, the university might be stolen out from under us. We live, as the pundits never fail to tell us, in an information age, in the knowledge economy. Universities have two functions: to educate students and to produce knowledge. The old idea was that the knowledge was disseminated in the classroom and through publication. It was placed in the public sphere, labeled as to origin (author), but underwritten financially by tuition dollars and general public investment (via taxes and philanthropy) in the university. Of course, there was some specifically commissioned research, especially that done for the government within the context of the Cold War. But foundations and donors generally took a hands-off approach to research topics and, more importantly, didn’t claim proprietary rights to the research results.

All that has changed drastically in the past ten to fifteen years. While government funding has leveled off, the corporate world has increasingly turned to universities for specific research needs. And the recognition that the knowledge produced in (especially) scientific research has (sometimes) immense economic value has led to a sea-change in how research is commissioned and what happens to its results. Increasingly, new knowledge is licensed or patented, with the researcher, the university, and the corporate sponsor receiving designated shares in the product. Publication, even public discussion, of research results is delayed until licensing or a patent is secured. Professors in the fields affected are now as much entrepreneurs as academics, moving between the university laboratory and the business world.

Universities have gone down this path because they are money pits. Tuition—even with rises that greatly exceed the inflation rate—has never covered the costs of maintaining a university, especially a research university. The federal government underwrote much of that cost during the Cold War, and the humanities existed on the overspill of the federal largesse. But the corporate dollars that have stepped into the vacuum left by the shrinking of federal dollars are more directed than federal dollars, less tolerant of massive “indirect cost” rates. As a result, the humanities are in danger of withering away. Their only resource in the competition for dollars is alumni donors who retain a sentimental attachment to undergraduate liberal arts programs. Since the wealthiest donors, however, come from the same corporate world that is forging this new relation to the university, even individual donations are becoming more and more specifically targeted. The humanities increasingly have to “market” themselves and have to develop specific programs in response to donor demands or in the attempt to attract donor dollars. If the science professor is half entrepreneur, half academic, the humanities professor is on the way to becoming half fund-raiser, half PR man. Universities are engaged in an endless search for money, and various units of the university are in competition for limited access to identified donors. Not having a product to sell to corporations places the humanities at a distinct disadvantage.

Of course, licensing is going to come to the humanities as well. There was always the opportunity to make a little money on the side by writing a textbook or editing an anthology. Such work was looked down upon, and you could only get away with it if you didn’t do it to the exclusion of more prestigious (less remunerative) work, or if you simply brazened out your colleagues’ disapproval. But the Internet may change this game by dramatically changing the sums of money in question. Right now, of course, copyright and its relation to the Internet is in flux. But humanists, while sometimes obsessed with intellectual property rights in ideas, have not had economic reasons for that obsession. Whether or not work in the humanities will actually attain any great economic value, we should fully expect speculative action based on that possibility. Licensing arrangements are going to become more prevalent—and will undoubtedly affect how some humanities professors view their work and their careers.

Beyond the sentimental value of the alma mater, all that remains to the liberal arts is prestige value. “Culture” of the highbrow sort still retains some value, although less and less all the time. But the prestige or “brand” value of the top universities has never been higher. University and college presidents rise and fall on the basis of the annual U.S. News and World Report ratings—and lower administrators have the squeeze put on them to pull their units up in the rankings. So the humanities do have some leverage on the general finances of the university because the humanities disproportionately (in relation to research dollars generated and numbers of majors) influence an institution’s prestige. The humanities are a luxury item (for students as well as for universities) and, like most luxury goods, play a major role in both determining and representing status. For those of us who are foolish enough to take the humanities seriously, to believe in their transformative potential, the funding offered by bemused functionaries who find a little culture adds luster is just about worse than no funding at all.

LOVE

Critics of ideological readings of literature often complain that the “love of literature” has all but disappeared from today’s English departments. What has love got to do with it? Consider the following statements:

“I teach physics because I love electrons.”
“I teach the history of slavery because I love slavery.”
“I teach Shakespeare because I love Shakespeare.”

Professionals, as opposed to businessmen, are supposed to love their work, to have other than mercenary motives for their undertakings. That is why, as Stanley Fish (1994) points out, professionals (with the exception of doctors and lawyers) desire handsome but not extravagant salaries (unlike corporate executives, who apparently have no qualms), and have elaborate non-flaunty ways to spend their earnings.

But to love your work does not necessarily translate into loving the object you work upon. The additional demand on English, music, classics, and art history professors has to do with aesthetics, not professionalism or education. English and other aesthetic departments often feel on the defensive in universities that seem increasingly driven to justify their work on utilitarian grounds. Especially in research universities, such departments are expected to publish, to make their contribution to knowledge. Yet sentimentality about the arts, about “culture” as something to be appreciated, generates a hostility toward the probing and questioning of Shakespeare even more than toward the demythologizing of George Washington or Thomas Jefferson. The on-going ambivalence of commercial society toward the arts—are they meaningless drivel of no worth or products of a social (spiritual?) superiority to be respected even where not understood?—phrases itself in this petulant demand that the professors love these non-utilitarian objects that they make their students study.

METHOD

Method, like amazon.com, is vastly overvalued. Methodology is an even bigger boondoggle. Rigor resembles nothing so much as rigor mortis.

As heuristics, methods, like disciplinary training, can open minds to new ways of thinking, to new angles of analysis. And methodology as the self-conscious consideration of the kinds of arguments being made and warrants being offered can help make the practitioner more aware of what he or she is doing. To believe, however, that methods or methodologies can either assure truth or conviction is to grossly underestimate the plurality of sources, connections, intuitions, prejudices, evidential conditions, and reasonings that play a role in any judgment of facts or values. Rigid adherents of method want to train (discipline) wayward minds and/or contain the messiness of thought and belief. The energy devoted to the enterprise suggests its similarity to efforts to hold back the sea. The leaks spring up daily and everywhere. I am not arguing that the effort has no benefits, only that the benefits are consistently exaggerated, the costs persistently under-reported.

One cost is dullness. How are you going to convince anyone if your careful, methodical work is too deadly to read? Methodological work is slow and predictable. It is “academic” in the sense of that word when applied to paintings. It keeps us going over the same ground again and again, never daring to assume anything, always required to spell out everything in excruciating detail.

Since you cannot, via care or method, guarantee your audience’s acquiescence, why not step boldly into the dialogic arena? “Often wrong, but never in doubt,” says Kenneth Burke. Let’s make that our mantra for suggestive work painted in broad strokes, aiming to provoke as much as to convince. Let’s be realistic about the various and unpredictable ways that an assertion strikes its auditors not merely as true or false, but as interesting, unsettling, inspiring, infuriating, depressing, exhilarating, boring, enlightening. When it comes to writing, energy is almost everything, method just about nothing. The goal, the trick, the difficulty is to get the spark down onto the page in such a way that it will then leap across the gap between writer and reader.

NIHILISM

It’s always the other guy who is a nihilist. The modern imagination is haunted by two recurrent figures: the hubristic man who tries to step into the power vacuum created by the death of god (from Faust and Dr. Frankenstein to various criminals in Superman comics and James Bond novels) and the depressive who can’t get out of bed because god is dead (from the Byronic hero to Camus’s Meursault and John Barth’s Jake Horner). Dostoyevsky brilliantly recognizes the two figures’ essential affinity by combining them in Stavrogin in The Possessed.

Despite our being the Prozac nation, I am more impressed (as is Bruce Springsteen) by the regularity with which “at the end of every hard-earned day,” we all once again “find some reason to believe.” Bound by the networks of daily life with its persistent demands, its structures of involvement, and its worn paths of routine, the remarkable thing is how few people fall through the cracks and can no longer go through the motions. As a way of life, modern society has felt no less self-evident, no less solid, no less necessary to humans than any other historical way of life—at least if we go by the evidence of its relentless going-on. It seems less and less like modernity is built on sand, more and more like it is an implacable, unchangeable fact. And that implacability comes as much from its apparent ability to provide humans with all the meaning they need to keep functioning as from any other factor. Reports of nihilism are greatly exaggerated.

OBSCURITY

Modernity has its winners and its losers. On the most brutal economic level, this means the starvation of the “undeveloped.” If modernity supplies them with any meaning, it is the meaning that rests in a desperate struggle for life itself. The obscurity of their suffering to the more fortunate relies, to some extent, on the geographical separations characteristic of the modern economic order. The prosperous are shielded from knowing on whom their prosperity rests. Consumers are carefully protected from suffering or even realizing the consequences (economic, environmental) of their consumption. Partly it is a matter of scale. It is hard to think through the consequences of my eating this hamburger when conjoined with 200 million other Americans also eating beef today. But there is also a concerted effort made to keep such information unavailable. Much is actively done to protect the sensibility of the consumer. The processes of manufacture are, like the poor, kept as far out of sight as possible. The complicity here is fairly complete: we (the consumers) don’t want to know, and they (the producers) don’t want us to know.

But this obscurity of consequences does not obscure the fact that there are winners and losers, and that the fate of the losers is precisely obscurity, to be shunted out of sight and left to fend for themselves. We should not underestimate the extent to which a clear vision of the consequences of losing keeps our noses to the grindstone. The “reason to believe” we find at the end of the day may often be little more than a vision of the cost of not getting out of bed to do it all again tomorrow. Nihilism is a luxury item.

All of this suggests that in the microcosm of the academy, where there are fewer jobs than people who want them and even fewer “good” jobs (the kinds of jobs which actually provide some chance of having the sort of intellectual life one got a PhD to obtain), the threat of obscurity looms large. Strategic decisions about the kind of work most likely to ensure visibility must be made all along the line, and theory has made those decisions more difficult. Much evidence suggests that theoretical work is sexy. That’s what students want to study; that’s what the readers of academic books want to read. But to do theoretical work before tenure, and especially before having a job, can prove disastrous. There is some expectation in some quarters that people should do “traditional” or discipline-specific work first. But there is no consensus on this or any other topic of training or appropriate first projects. The job markets and prestige hierarchies, even within disciplines and fields, are fragmented, and thus every decision about what work to do has consequences, some of which cannot be calculated in advance. Obscurity always threatens, and the rules by which to avoid it are more obscure than ever.

POWER

After one hundred and fifty years of theoretical and all-too-real battles, four remnants of Marxism are left standing in American academic discourse (which came to Marxism very late): a certain sentimentality about class; a proclivity for analyses that locate causes at the structural or systemic level; a hopelessly confused reliance on concepts of ideology and hegemony; and a stubborn focus on power relations. Lenin’s question of “Who is doing what to whom?” may not be asked in that form, especially since our notions of power have been depersonalized, but the centrality of power to any social analysis remains one hallmark of leftist thought.

A promiscuous understanding of power dominates the current scene. Power is doing it to someone at every conceivable site in every conceivable way—and “conceive” is the right word, because power is “productive.” Foucault, of course, reigns supreme here, and I don’t think it entirely coincidental that he was fascinated by the French tradition (from de Sade through the decadents to Bataille, Genet, and Artaud) that explores the erotics of inflicted pain.

Attention to power, whatever its genesis, seems essential to me, as does the insight that power operates in many different modes and, therefore, is contested in many different ways. No single key will unlock relations of domination. So there is nothing wrong with intellectuals in the human sciences focusing primarily on power’s discursive forms and operations. We humanists are in the symbol business, so we should consider the symbolics of power. And the past eighty years (at least) offer ample evidence of the capacity of symbols to move people to action.

But I feel compelled to articulate two further worries about discursive, symbolic analyses. The first concerns judgments of harm. Some part of me wants to insist that sticks and stones may break my bones, but words will never hurt me. To collapse physical and/or material harm into discursive harm creates an undifferentiated mass exactly where the ability to make distinctions is crucial. Despicable as hate speech is, it is important to differentiate responses to it from appropriate responses to physical violence. At the “deeper” level of the discursive organization of thought, it is important conceptually to recognize that some acts of violence and exploitation are not accompanied by discursive categorization of the victim as “other.” Greed, anger, and hate can be directed against my brother, even against my self. There has been a tendency to assert that discursive forms of violence, of categorization, underwrite all acts of physical and material harm.

My second worry is that a focus on the discursive too often leads to the naïve assumption that action on the discursive front will be transformative. Again, let me hasten to say that I am convinced that power functions discursively and that cultural politics has been demonstrably important on many fronts over the past fifty years. But I think we should be equally suspicious when intellectuals bemoan their impotence through marginalization and when they proclaim their corner of the universe—symbols—as the spot where the most fundamental action takes place. In other words, I accept, even insist, that power works discursively, but resist a unifying vision of power that places this discursive functioning at the ground level. Power works in myriad ways—and these ways stand in no necessary relation to one another. Not all of the ways have to be at work in any one instance, and the inter-relation among the ways will be different in different instances. Altering discursive practices has no necessary effects, and the political efficacy of centering our efforts on the discursive is never assured. There are decisions to be made every step of the way, decisions not only about what would be the most effective intervention in this case, but also decisions about what available resources make feasible in the way of action at the present moment, and what to devote attention to. There is no template that can substitute for or guide specific judgments made in relation to fallible assessments of the particulars.

QUEER

The most common assertion about discursive power is that it relies on strict categorization, on a place for everything and everything in its place. Thus the would-be challengers of this power favor the hybrid, the shape-changing trickster, the queer. That which confounds categories and crosses boundaries is thereby disruptive, if not transformative. Queer, then, stands strongly against the identity politics of homosexuality, eschewing the respectability and responsibility of a stable (albeit outlawed) desire for a mobile desire that cannot be pinned down by one name.

Queer importantly reminds us that sexual practices and desires are more various and fluid than our vocabulary for these matters admits. There may be nothing new under the sun when it comes to sex, but our language has yet to acknowledge what people are doing. Queer’s disadvantage as a politics is akin to the flaw of all anarchisms. There is no discernible or imposable direction to the fluidities queer theory wants to celebrate. Beyond the liberatory hope that people will be left in peace as regards their sexual activities, a queer politics finds itself hard-pressed to think through issues of responsibility and harm in sexual relationships, not to mention even wider issues of human togetherness in society.

RACE

The lightness of queer theory, its failure to think past the lifting of social sanctions against non-standard sexual practices, is evident once we turn our attention to race. What could be heavier, more depressing? At century’s end, the dream of integration is in shambles. The nation’s schools are more segregated in 2000 than they were in 1960, and the actions of neither whites nor blacks show a deep commitment to fighting what has proved the path of least resistance. The benefits of integration have proved so elusive—hard to specify and even harder to achieve—that the constant effort required has come to seem not worth the trouble. Workplace integration has created a black middle class, but with disastrous effects on the black community as a whole, both for the poor blacks left behind and for the successful blacks who suffer under the misconceptions and hatred generated by affirmative action. Society as a whole seems to have settled on peaceful co-existence; whites cede blacks certain spaces and a small slice of the pie, while coming down hard on every perceived threat (i.e., angry young black men) to the uneasy peace. The real achievements of the civil rights movement—the end of legal discrimination prime among them—begin to dim when contrasted with the woes attendant upon the continuing existence of blacks as a caste apart. A minimal legal tolerance of racial difference is no substitute for the interweaving of destinies which comes only from daily interaction.

Post-theoretical work should be honored to the extent that it has been obsessed with race. (Yes, white intellectuals have to be continually prodded by black intellectuals to keep race in view. But that blacks in the academy have such moral and intellectual authority already suggests a difference from how matters are arranged in other institutions.) Such work has refused to turn its face from a topic most of the country wishes would go away. Because of that wish, the obsession is more than justified. It is a responsibility. This effort to keep the intricate difficulties of race in America a continuing and continual topic of investigation, analysis, articulation, and debate exemplifies what a functioning intellectual class can—and cannot—do. Intellectuals cannot, on their own, make the nation face up to its persistent racial divides, but they can refuse to partake of the nation’s desperate attempt to ignore the whole topic.

STYLE

The last entry wavers between intellectuals and academics. No surprise: most intellectuals in America are now, perforce, academics. The opportunities for a “man of letters” (with a partial exception for novelists) to make a living outside the university have shrunk to just about zero. So many academics are willing to write for a pittance (their monetary reward will come in pay raises) that newspapers and journals do not have to pay a living wage in order to fill their pages.

The much-lamented demise of the public intellectual stems from this stern economic fact. (In any case, Britain had true public intellectuals and, to some extent, still has. America did not. Every twentieth-century American you could nominate for the role either lived in Europe or spent a lot of time on campus.) Once the universities began to subsidize publication, the economic burden of supporting intellectuals was lifted from the publishing firms. Of course, the existence of a “public”—especially a paying public—for the intellectual to address has always been a problem in America. Partisan Review’s influence and prestige had nothing to do with the number of readers it reached. The New Yorker has been losing money for over ten years now. So it is disingenuous—especially for writers funded by conservative think-tanks, the only extant economic alternative to taking an academic job—to blame academics for not addressing a nonexistent public and for hiding out in universities which offer them the sole chance to have the money and free time needed to write anything at all.

The sting of the public intellectual debate comes in when one assumes that a narrowing of style is the real issue. The anti-theory crew takes the Wordsworthian position of calling for the plain language of “a man speaking to men.” Such a common-sense language has a broad appeal and rests on broadly applicable principles of logic, reason, evidence, and nontechnical diction. Contemporary academic discourse, the claim goes, is too specialized, too exclusive.

The charge is close enough to the truth to score a palpable hit. Students, both undergraduate and graduate, have to be fairly carefully initiated into the mysteries of the craft before much academic prose becomes accessible to them. However, the common-sense language is hardly as broad as its apologists believe. Its standards of reasonableness and the like are no less limiting for going unnoticed.

The faults of contemporary academic style, however, have little to do with the triumph of theory and much to do with heightened publication requirements for securing, keeping, and advancing in academic jobs. The more that publication functions as the means for institutional evaluation of professors, the more such writing adopts professionally sanctioned forms. Hiring (via job markets organized by the professional associations), tenure (via the requirement of “outside letters”), and publication (via the reliance on “referees”) decisions are increasingly centralized, with authority vested in national, not local, figures and institutions. (In practice, the local is often disvalued, with publications and/or presentations addressed to local audiences at best neutral and at worst harmful to one’s professional standing.) The centralization of evaluation in the national professional community leads to increasing uniformity within fields. Departments with distinctive styles (the Chicago neo-Aristotelians) become rarer (only second-tier universities are now willing to risk oddness) as each department strives to be a microcosm of the discipline. Similarly, eccentric professors are an endangered species, since you can only publish if tuned into the prevailing questions and modes of argument in your field. Even the breaking open of the canon and the penchant of theorists to bring new texts into play have not worked very strongly against this move toward standardization. Nonstandard sources must almost always be bundled with more familiar materials, while the terms of the arguments made must be recognizable even if the text is not.

The issue, then, is not so much the difficulty of any particular academic’s writing as the pressures of professionalization. It isn’t that academic prose lacks a common-sense style, but that professional standardization works against having any style (defined as a distinctive angle of vision accompanied by a characteristic tone) at all. Apart from Derrida, Harold Bloom, and Stanley Cavell, who among the major figures of theory and post-theory could be called a great stylist, or even be said to have a distinctive style? Foucault is a great writer, but he has no particular style. Lacan had a style, but an awful one. Lyotard, Habermas, Deleuze, de Man, and Spivak are not even good writers. But lest I sound like a neo-conservative, let me hasten to remind you that the issue isn’t good writing, but a distinctive style. The neo-cons’ image of public discourse is equally flat, equally the enemy of style, although for different reasons. The academic audience doesn’t miss style, because it wants the ideas, the engagement with the on-going debates in the field. In large part, the academic reads in order to fuel his or her own writing, and thus extracts the juice and throws the squeezed fruit away. The neo-conservative dislikes style because it is excessive, ungovernable, non-deferential (to common sense or any other extrinsic standard), unreasonable, apt to rock the boat. Just think of the wide range of nineteenth-century prose styles—vatic Coleridge; dyspeptic Carlyle; the jeremiads of Marx; the lay sermons of Arnold and George Eliot; Ruskin, magisterial one moment, whining the next; pompous, sentimental, and humorous Dickens; avuncular Trollope; cynical Thackeray; accusatory Zola; ironic Flaubert; playful Wilde—and you realize how shrunken our current palette is. Who today is a great personality in our public world by virtue of what he or she writes?

One contemporary response to this vacuum stresses autobiographical writing, with an accompanying interest in “voice.” Where institutions flatten out the idiosyncratic, these academics (many of whom are women) want to recover the different through the personal. (We have another example here of an unexpected alliance between poststructural accounts of difference and an emphasis on differences located at the level of the self.) I am in favor of anything that works against the standardization of prose within or outside of the academy.

THEORY AND TRADITION

That the rise of theory coincided with a new aggression toward the tradition is contingent, albeit overdetermined. The increased demand for publication, with its insistence on novelty, makes the tradition feel like a burden while also disallowing the repetition of received truths. A publishing professoriate cannot just be the custodian of tradition. Scholar-teachers must use the tradition to generate new work. Negation of old chestnuts is the quickest path to novelty and notoriety, as Wilde and Shaw demonstrated one hundred years ago.

But we should also recognize that theory is an indispensable tool for the contemporary arriviste. Those who are to the manner born are steeped in the tradition; their sensibilities rise out of their immersion in a thousand books. As Matthew Arnold noted of the aristocrats he called barbarians and T. S. Eliot admiringly said of Henry James, such minds never rise to the level of ideas. General categories, codifications, and maps of the territory are instruments developed to aid those who are playing catch-up. Theory is a by-product of democratic pedagogy. It gives the student a handle on vast amounts of material he or she has never read or experienced. The old-timers bemoan the ignorance of the theoretical, while the theoretical are amazed by the parochialism and complacency of the old-timers. We theorists read everything and have to publish, says my generation. But our elders point to all the things we have not read, all those minor poets who (because white males) have not benefited from the opening of the canon. Reading everything, they say, just means skimming the surface of more fields—with the resultant addiction to generalizations.

The fundamental shift, it seems to me, lies in the very goal of the whole enterprise. Formerly, the aim was cultivation, the development of a sensibility, and thus even the academic social climber had to bury any resentment he or she felt against the tradition and its institutions beneath an acquired and studied reverence. The Anglophilia of two generations of academics—pipes, sherry, and tweeds—marks this effort to become more lordly than the lords.

A more confrontational, irreverent, casual, and “authentic” style came in with the 1960s. Baby-boomer American male academics never feel quite comfortable in a tie and are never quite sure when they can get away without wearing one. Academics from this generation are no less arriviste, but are bound by a youthful oath never to “sell out.” (This sensibility and its pathos are captured perfectly in Bruce Springsteen’s “No Surrender,” with its refrain of “no retreat, no surrender” and its suggestion that the ultimate source is the kind of World War II movie that has now been revived by Steven Spielberg after the twenty-five-year lapse caused by the Vietnam War.) Certain notions of integrity, solidarity, and authenticity now had to be reconciled with going through the institutional hoops. Politically motivated work offered one possibility, a more critical and adversarial relation to the tradition another. (Don’t get me wrong. That political work has personal motives is, for me, not a reductive dismissal of that work’s significance or potential benefits.) In sum, the heightened demand for publication, the cultural and political sea-changes of the 1960s, the increased use of theoretical mappings to substitute for particularist immersion, the idea-oriented analyses of received bodies of knowledge, and the influx of women and non-white students with various reasons to be suspicious of the canon, all combined to change the status of tradition at approximately the same time (1970–75) that French theory hit these shores.

UNHEARD AND UNSEEN

Poststructuralism’s interest in the “other” (dramatically evident in Foucault’s work on madness, hospitals, and prisons) combined with the civil rights and feminist movements in this country to focus academic attention on neglected or forgotten voices. Much of the early emphasis in African-American Studies and Women’s Studies was on “recovery work,” bringing into the curriculum and the scholarly universe texts and other materials produced by or related to nondominant social groups. In literature departments, especially, it seems that the one thing we now succeed in conveying to all PhDs is a sensitivity to what has been or is potentially “excluded” in any syllabus or academic study. An ethic of all-inclusiveness, accompanied by a scrupulous attempt to search out the unheard and unseen, rules the roost, with some ludicrous, but many laudatory, results.

Theory’s role here, to my mind, has been less positive. The problem is that much poststructuralist theory takes a strongly deterministic line, one that insists that thought and perception are products of conceptual systems that necessarily fall short of all-inclusiveness. To the Hegelian truism that something is defined in relation to what it is not, poststructuralism (in some versions) adds that we are necessarily unconscious of that thing which lies outside the borders of the defined. The “unthought” or the “unthinkable” constitutes all we are conscious of, but itself lies beyond the reach of consciousness. Yet we have an ethical responsibility to this unheard and unseen other. A hyper-scrupulosity accompanies this mysterious call of the other. We can’t (because of the necessary limits of perception) hear the sound of the tree falling in the forest, but have an absolute responsibility to respond to it. And then we get one further scruple: if we do hear the tree, that hearing will be a translation of the tree’s sound into our representational system, a translation that violates the tree’s “irreducible alterity” and thus is precisely the opposite of a truly ethical response. “The violence of metaphysics” names this will to appropriation, this persistent drive to understand things on our terms.

Frankly, the appeal of this ethics (most fully developed in Levinas and Derrida, but also evident in Lyotard, Nancy, and others) baffles me. It is not that I am firmly in the “ought implies can” school, although I do think, given all the evil in the world, that focusing on achievable ethical goals would do more good. My chief response is that this ethics seems awfully thin when confronted by the textured thickness of our interactions with actual others. Levinas has written thousands of pages on an idea that seems exhausted to me after a few paragraphs—since any specification of what the call to responsibility might actually mean in the context of lived relations to others would violate the alterity that underwrites the absolute unrefusability of this ethical demand. This ethics goes on and on about “the other,” but almost never talks of others, in what seems to me, finally, a very solipsistic or religious focus on the relationship between self and God (now renamed the other), instead of a social focus on the many relations in which we stand to numerous other people. As a result, too much of what is ethically relevant in human existence is just passed over. Since humans continually mistreat, in very specifiable ways, others who are not absolutely beyond the pale of our modes of thought and representation, I’ll gladly settle for an ethics that starts closer to home and has concrete things to say and judgments to make about particular courses of human action. To be worrying about some other of whose existence I am unaware because of the inbuilt limits of thought seems quite a luxury when there are millions of others I can see and hear who are suffering from the ills “man does to man.”

I want to lodge a theoretical, as well as this practical, protest against poststructuralist ethics. I talk of “obscurity” above, which indicates that I am greatly moved by the general concern of academics over the past thirty years for the neglected and overlooked. For that very reason, it seems crucial to me to insist that nothing and no other is necessarily beyond our capacity to apprehend or necessarily harmed by the modes of that apprehension. Poststructuralism, surprisingly, remains addicted to transcendental arguments of the Kantian sort, the identifying of necessary (usually formal) conditions underlying an activity. The notion that form is determinative has gotten way too much credence. For example, a parliamentary form of government will tend toward certain effects, as opposed to an absolute monarchy. But what effects will actually ensue depends (contingently) on the interaction of the form of government with countless other factors in the actual society and time of the interaction. Similarities of form are no guarantee of similarity of outcomes. Thus, to claim that the form in which an other is described can be judged “violent” and “unethical” in every single case, with no attention to the particulars of cases, seems to me simply wrong. Such an approach also takes the easy way out, enunciating a general principle to avoid precisely what makes ethics so troublesome: the need to make differentiated judgments on a case-by-case basis.

Transcendental arguments also violate the rule of symmetry, or what might be called the anti-arrogance rule. The philosopher (or social critic) should not arrogate to himself an ability to discern harm that is denied to others. If we are necessarily blind to certain harms or necessarily unable to articulate certain harms, then how does the philosopher know that some harm, some violence, is occurring? If the other who we say is harmed is unconscious of the harm, then where does the harm reside? The basic principle here is that harm is a human concept, that contestation over what constitutes harm is the very stuff of ethics, and that the transcendental argument abrogates the whole enterprise by trumping that contestation in the name of a harm no one but the philosopher (and he only dimly) can perceive.

But what about the harm done to non-human others? Poststructuralist ethics, with its effort to get beyond the limits of human articulations of harm, is often seen as particularly useful for environmental ethics or animal-rights efforts. I don’t see how an escape from the human is possible here. Ethical claims are claims made by humans upon other humans, sometimes in relation to non-human entities (the earth, the gods, animals). But until the claim has been articulated in human language, addressed to specific humans, and acknowledged in human practices, it does not take up residence in human societies.

And I, at least, would not want it any other way. Ethical claims can only be contested if they are made within the same kinds of dialogic space that enable social interactions. Poststructuralist ethics, oddly enough, aims at creating a primordial, absolute, uncontestable ethical demand (the unrefusable call of the other) below or prior to dialogic contestation. Somehow, the general claim that we are all guilty of harms we cannot even apprehend and all responsible for others we will never (and should never presume to) know is seen as guaranteeing that we will, at least, have an ethics. But I take it that the twentieth century teaches that there is no such guarantee. When the claims of actual others are so persistently ignored, the notion that the claim of unheard and unseen others will save ethics appears quixotic to say the least. Humans do evil things, just as humans construct ethical principles and make ethical claims upon one another in an effort to prevent evil. The contest between evil and ethics gives neither side an inevitable leg up—and no philosophical legerdemain can tilt this balance of power. Ethics rests on the multiple decisions made one at a time by the multiple human agents who live amidst others and their competing claims for recognition, love, care, resources, justice, freedom and the various other goods (material and non-material) that remain in all too scarce supply. There are various ways that these claims can be silenced or ignored, but no necessity that some can never be heard, and no remedy other than the persistent effort to gain a hearing in spite of the forces striving to maintain obscurity. That this conclusion will sound harsh to many of my readers only suggests to me that they still believe in some philosophical solution to, some transcendent substitute for, the endless human effort to restrain human evil.

VALUES

Not only Republicans are worried about values. There has been a general outpouring of academic books, within theory and without, on ethics, morality, and values over the past fifteen years. Professional ethics in business and especially medical schools is a growth field, even as philosophers of all stripes have returned to issues and questions that lay long dormant in the wake of logical positivism’s assault on moral statements as “noncognitive indications of preferences.”

I have already suggested that poststructuralist ethics seems concerned to combat the ethical skeptic—and takes a surprisingly traditional route (the identification of an inescapable grounding necessity) to do the trick. My view of ethical skepticism (i.e. the denial of any or all ethical claims upon behavior) is akin to my view of nihilism. It is a phantom more than a reality. I take my cue from C. S. Peirce’s critique of Cartesian doubt. It’s a parlor game (as Hume also noticed) to discard one’s commitments totally—and has no relation to how selves actually function in the world. Each person always already has beliefs and values. Those beliefs and values may change (although even that is a laborious process and probably fairly rare), but they are not shed altogether. Beliefs and values orient us in the world; they are what allow us to pick out the salient features (from the angle of vision they form) in any situation and to make decisions, register impressions, and act. A person without values would be a person without qualities.

Philosophical ethics has been far too preoccupied with trying to answer the hypothetical question: “Why have any values at all?” The more pressing question is: “How do we live in a world with multiple and conflicting values?” I believe that, as history bears out, we cannot achieve unanimity about values. Furthermore, as a matter of principle or theory, despite the dreams of some philosophers and many cultural conservatives, achieved unanimity around unquestionable absolutes seems much more dystopic than utopian. How could we wish for a world in which independent thought, a questioning attitude, and behavior that went against received opinion completely disappeared?

Yet perpetual disagreement can only be desired when constrained within codes of civility that allow basic life-world activities to continue unimpeded. Civil war is not a condition to be wished on anyone. Some middle ground between absolute agreement and absolute discord is the goal—which is why ethics as an enterprise must always resist simplistic solutions. We need two opposite poles—contestation and agreement—in a proper balance that is contingently related to various other factors (such as degrees of economic inequality) in any particular situation.

I want to note two complications in prevailing attitudes toward values among contemporary academics. A significant plank of any liberal ethos, and the central plank of Kantian ethics, is the value placed upon individual autonomy. Contemporary theory usually takes umbrage at all things liberal, and has significantly and convincingly argued that selves are not as autonomous in their values and their decisions as liberal accounts imagine. Even if we adopt a fairly integrated view of the self as a bundle of beliefs, values, memories, and habits carried through time, that self is constructed through social processes that shape its most fundamental commitments. (In fact, many contemporary theoretical accounts make it hard to account for differences among the products of these social processes.) Yet even as autonomy is critiqued as a fact and as an ideology, almost all academics honor it as an ideal in their practice as teachers. Most of us believe it is an outrageous abuse of our power to insist (for grading purposes) that students agree with our opinions or values. More generally, most of us scrupulously strive to give our students the tools to think for themselves, rather than supply them with certain content as unquestionable truth. And in even the most radical and anti-liberal theoretical work, a bottom-line autonomy of selves is almost always assumed as among the ethical and political goods being sought. My point is not to claim that such work is hopelessly confused, nor that liberal values hold a universal allegiance even among those who claim to dispute them (Habermas seems to believe something like this), but to suggest that the ethical good we seek (that balance between absolute unanimity and dysfunctional disagreement) does not translate easily into being simply for or against autonomy.

Similarly, an ethic of all-inclusiveness is too simplistic. Differential judgments will be made all the time; ethics says we should justify those judgments. A call to avoid all such judgments can only be made in a discourse that is safely separated from the real world. The writers producing such calls are almost always involved in deciding whose work gets published and who will be admitted to their graduate programs. It is hard to avoid the sense that many academics pride themselves on keeping their hands clean—and they do so only by ignoring their own daily acts of judging (grades, for starters) and by leaving the dirty work to be done by others. Yes, many of our excluding judgments are outrageous. But I am convinced that the proper ethical response is not some general condemnation of all judgments. Rather, ethics involves the explicit examination and articulation of our values as they are lived out in our judgments. Every judgment is accountable to those values, and every value should be open to contestation by others. The scene of judgment and its evaluation by others needs to be as public as possible. Obscurity here, as elsewhere, serves the privileged (those who have the power to exclude) better than the vast majority.

THE WORLD-WIDE WEB

My department hired two assistant professors in instructional technologies and I felt intimations of old fogeyism. Here, I thought, is the first new thing coming down the pike on which I will pass. It’s all well and good for them, but I can see my way safely to retirement without having mastered or used the Web, being ignorant (and proud of it) of chat-rooms, and never having to teach a distance-learning course.

Two years later, I am not so sure. My essential activities—read, think, discuss, talk, write, grade papers—are starting to look a little different. I don’t think computers will change everything. But they will change some things, are doing so already, and academics (even in the humanities) are not going to be able to hold out much longer.

The predicted impacts on written work have been slow to materialize. But I like the formal possibilities, especially for tiered texts. An overture would hit all the main themes, and then readers could click on various items to get fuller expositions, supporting arguments, references, and even full source materials. As a writer, the flexibility of different organizational strategies appeals to me. Since I am one of the few people I know who is addicted to reading books from cover to cover, I’m more wary as a reader. My reading habits may be less flexible. I find contemporary magazine (and grade-school textbook) layout, with its side-bars and boxes (anything to disrupt continuous reading over several pages), deeply annoying. I crave the consecutive when reading, even while finding it less than satisfactory for much of what I want to say when writing. Hence tiered texts of discrete, consecutive parts make sense to me.

New publishing formats—digital, on-line, or otherwise—strike me as neither here nor there unless they shift possible audiences. To put something on the Web seems to me equivalent to putting it in a drawer, not because I care about the refereeing process, but because there is no targeted, no designated, audience. The academic who publishes a book that sells 800 copies can still feel read if that book is taken up by his or her field. There are, in other words, all kinds of institutional venues for the book. By contrast, the novelist who sells 5000 copies will much more likely feel that his book has sunk without a trace, perched on several thousand public library shelves. (Of course, many academic books also generate that feeling, but not if they sell 5000 copies.) The Web is so amorphous, so unorganized, so (in a word) a-institutional that, despite its touted dialogic capacities, publishing on it seems, for an academic like myself who has gained a place in the institutional conversation, a wanton disregard of the desire to reach an audience. The time and energy required to write are hard to summon—and the effort is driven (once the institutional ladder has been climbed) by the urge to connect with the reader. I cannot imagine the reader provided by the World Wide Web. This imaginative incapacity on my part defines the gap between me and the younger scholar who is Web-oriented. I ask “why bother?” where my younger colleagues see the very place in which they want their work to appear. Of course, one usually assumes the visibility of pieces that appear in the places one reads. When I start using the Web more, perhaps I’ll start believing that others like me will see and read what’s on the Web. I can’t predict if that will happen. My old fogeyism hangs in the balance.

XENOPHOBIA

The next person who says, knowingly, that “Derrida and his ilk haven’t been taken seriously in France for years” should be condemned to a year of reading only Lacan—in French. The same anti-theory zealots who derided American academics for slavishly granting authority to all things French now appeal to the authority of reported French disdain for all things poststructuralist. The national identity of ideas is irrelevant; the idea of national identity needs to be exploded.

A YAULD YIRR

I’ve ended up with something between a rant and an essay. I began with the hope of conveying why my reaction to theorists and anti-theorists alike is so often “a plague on both your houses.” I don’t want to be a curmudgeon, or one of those smug and nasty types who congratulate themselves on speaking home truths no one wants to hear. (These supposedly lonely truth-tellers, from Allan to Harold Bloom, have consistently found larger audiences than the conformist cowards they sneer at.) I take it that academics (especially) have a duty to be optimistic, since pessimism is the easy road to doing nothing, to taking the world’s ills in stride. So if I yirr (snarl or growl as a dog does), I must be yauld (active, vigorous) about it—and in the service of an active moving forward. What really irks me are tunnel vision, narrowness of scope and purpose, and disdain for work that explores different questions, follows different protocols, and has different aspirations from one’s own. Relieved of the most serious threats to life and granted the space and time for reflection and inquiry, academics should respond by opening up the vistas of themselves, their colleagues, their students, and whatever audience they can reach beyond the academy. Theory, broadly conceived, has encouraged such opening more than it has shut it down. Evidence of active, challenging, expansive academic work is all around us—and that’s cheering news.

ZARATHUSTRA

The temptation is to close with prophecies. Theory has brought to academics the anxieties of fashion. What is the next new thing? Will I look outdated? How do I stay ahead of the curve?

In fact, the pace of change seems to have slowed down. We seem to be in a phase of assimilating, sifting through, and putting into practice the various new concepts and approaches theory suggests. There has been no “big” theoretical book that “everyone” must read since Judith Butler’s Gender Trouble and Eve Sedgwick’s Epistemology of the Closet, both of which are now more than ten years old. Another—more troubling—sign is that there are no forty-something European figures who are as well known or as widely read now as Habermas, Foucault, and Derrida were at that age. This reflects a broken tie with Europe, severed by the retirement of the émigré generation that staffed American universities after fleeing the Nazis. Returning to the monolingualism of seventy years ago (made only marginally less isolating by the ascendance of English as a global language), American universities may now simply be less capable of attending to foreign-born ideas than they were thirty years ago. It is not as if the loss of the European connection has been accompanied by any great strides toward connection with the rest of the globe. Intellectual globalization is barely an idea, and nowhere near a reality, irrespective of economic developments.

But the absence of new Derridas and Foucaults on the scene also suggests a pluralism that seems both positive and abiding. For a very short space in the 1960s (during the student movements that did sprout up all around the world) and an equally short time in the 1980s (when the term “postmodernism” did seem to capture some essential features of the time), our era fleetingly possessed a unity, an identity. But these moments dissolved into times whose varied characteristics are belied by any overarching designation. No one figure represents in himself or in his work our era; no intellectual movement speaks to every aspect of our “condition.” There is too much going on in too many different places. Such pluralism makes it foolhardy to predict what the future will look like, what will win out, what fade away. I think that the best we can hope (and should work) for is that pluralism itself is our future, that no one of the various viewpoints competing for attention manages to win general acclaim and crowd out the others. Which just might be my own way of proclaiming that god is dead.
