
CHAPTER 7

Toward a Pragmatist Pluralism

I have recommended pluralism often enough in the preceding chapters to owe my reader a fuller account of it. The topic is one that mostly still lies out in front of me, waiting to be explored. But I will try here to indicate some of the intuitions that make me want to light out for that territory and some of the topography’s looming features. These are “notes” toward a fully articulated position. The writers I take as my guides are hardly the only pluralists out there; they just happen to be the ones I have been influenced by.

Let me begin, for clarity’s sake, by identifying five variants of pluralism in the intellectual tradition. I would like to be faithful to all five of these but will admit that they are rarely considered together, and I have not yet worked out if they can really all be held together consistently. As Nicholas Rescher puts it, “Even pluralism itself—the doctrine that any substantial question admits of a variety of plausible but mutually conflicting responses—lies open to a plurality of versions and constructions” (1995, 79).1 It might be more faithful to the spirit of pluralism to see them as incommensurate; certainly they occupy very different discursive universes. The “pragmatist” qualifier in my title indicates not just that I come to pluralism through pragmatism, but also that my account of selves in complex situations comes from pragmatism. William James is the guiding spirit here, and we may take two passages from Pragmatism (1975; originally 1907) for beacons: “The pragmatism or pluralism which I defend has to fall back on a certain ultimate hardihood, a certain willingness to live without assurances or guarantees” (290), and, more epigrammatically, “Nothing outside of the flux secures the issue of it” (125).

Pluralism 1 is found in Ludwig Wittgenstein and J. L. Austin. This is the pluralism that inspires Wittgenstein’s epigraph for Philosophical Investigations (1958): “I’ll teach you differences” (from Shakespeare’s King Lear). Wittgenstein and Austin were interested in the variety of different ways we use language and were especially committed to overcoming the positivist tendency to see one kind of statement—assertion—as primary and/or as the ur-form of “sense.” Humans do a lot of different things and, thus, they use words in many different ways. The plurality of doing and saying should not be reduced when we attempt to describe, and especially to explain, this multitude. “Everything is what it is and not another thing,” says Bishop Butler (in Ignatieff, 1999, 51), but it proves very hard not to convert things into manifestations, effects, parts, stages, or appearances (deceptive or otherwise) of other things, once we begin intellectual work. The metaphor of “family resemblances” is one way Wittgenstein tries to salvage particularity, even when similarities and relationships to other things are acknowledged. As a son, my identity is significantly shaped by my relationship to other family members, and specific similarities can be noted. Pluralism will insist on the existence of many different shades of relation, each of which qualifies the particular in different ways. And it will resist the tendency of systematic thought to build ever larger networks of relation that subtend all the particulars in view. Austin offers a “general warning in philosophy. It seems to be … readily assumed that if we can only discover the true meanings of each of a cluster of key terms, … then it must without question transpire that each will fit into place in some single, interlocking, consistent, conceptual scheme. Not only is there no reason to assume this, but all historical probability is against it[.] … We may cheerfully use, and with weight, terms which are not so much head-on incompatible as simply disparate, which just do not fit in or even on” (1979, 203). The difficulty of thinking “disparateness” should not be underestimated.

Pluralism 2 is Nelson Goodman’s notion of multiple possible adequate descriptions of a given situation. Goodman’s pluralism is pragmatist in the William James/John Dewey tradition, with an emphasis on appropriate and possible responses rather than on adequate descriptions. “For a categorical system, what needs to be shown is not that it is true but what it can do,” writes Goodman (1978, 129). Different vocabularies enable different actions in the world, and since our actions re-form the world, Goodman speaks of multiple “ways of worldmaking.” Even if we accept that external circumstances limit the available options of speech and/or action, those circumstances never dictate one, and only one, possible response. And Goodman insists that circumstances must be understood as worlds constituted by prior human actions. He eloquently sums up his position: “The many stuffs—matter, energy, waves, phenomena—that worlds are made of are made along with the worlds. But made from what? Not from nothing, after all, but from other worlds. Worldmaking as we know it always starts from worlds already on hand; the making is a remaking…. My interest here is … with the processes involved in building a world out of others. With false hope of a firm foundation gone, with the world displaced by worlds that are but versions, with substance dissolved into function, and with the given acknowledged as taken, we face the questions of how worlds are made, tested, and known” (1996, 65).

A third pluralism can be attributed to Hannah Arendt, who stresses the “plurality” that stems from the existence of many distinct individuals. Arendt’s key term in this context is “natality” (taken from Augustine). Something new comes into the world with the birth of each individual; similarly, human action, performed by individuals, brings new things into the world. She calls action a “miracle,” by which she means to suggest that the appearance of novelty both exceeds calculation and is an embarrassment to theory. “Every act, seen from the perspective not of the agent but of the process in whose framework it occurs and whose automatism it interrupts, is a ‘miracle’—that is, something which could not be expected” (1977, 169). “[W]e know the author of ‘miracles.’ It is men who perform them—men who, because they have received the twofold gift of freedom and action can establish a reality of their own” (1977, 171). Humans make both the world and their selves in political action, says Arendt. A commitment to plurality undergirds Arendt’s advocacy of a politics that enables the appearance in public of that full individuality which only discloses itself in action before others. But plurality also grounds her basic ethical principle: the reduction of individuals in all their unique difference to types, to instances of general categories, is a violation of their freedom. And Arendt sees a direct link between the failure to cherish plural singularity, a failure that renders the individual “superfluous,” and the violence done to whole peoples under general names like “Jew” or “enemy of the revolution.”

Isaiah Berlin provides a fourth pluralism, one that focuses on the notion of competing goods. Trade-offs, compromises, and negotiations will always be necessary both because different individuals will prioritize competing goods differently and because many choices are painful second-bests. Berlin’s (1969) pluralism is connected to a variant of liberalism that eschews overarching, systematic solutions to social problems in favor of context-sensitive, ad hoc reactions that claim no authority or “correctness” beyond allowing social agents to “go on” (Wittgenstein’s phrase) in relative peace and prosperity until the next adjustment is required. All solutions are imperfect compromises that hold only so long as the various parties to the compromise are satisfied enough to refrain from rocking the boat, from demanding a renegotiation of the prevailing terms. I find Berlin the least attractive of all the writers I have mentioned thus far because I think he underestimates the extent to which power holds people to compromises they loathe. So I distrust his reliance on negative liberty, on the individual’s ability to withhold consent. But I also don’t know Berlin’s work as well as that of the other writers, and I am attracted to a view that stresses competing, incompatible, and disparate goods that vary from situation to situation. Berlin, of course, offers one version of liberal pluralism, a version that accords selves more sovereignty as agents than I think they actually possess.2

The fifth variant would be the methodological pluralism I am groping toward in chapter 5. Method is certainly too grand a term; it is more like lines of inquiry or characteristic ways of approaching a problem or topic. The goal is to shift our focus from determinate identity, from what a thing or set of relations has been or is, to what it enables, to how we “go on” from here, to what actions it makes possible. “Things and relations are not read in terms of something else or in terms of where they originate or their history but rather, pragmatically, in terms of their effects, what they do, what they make” (Grosz, 1994, 181). As I suggested in chapter 5, Foucault, James, and Arendt all offer hints; other helpful sources of ideas on this score are Paul Feyerabend (1993) and Barbara Herrnstein Smith (1988). The trick is to avoid the Scylla of “methodological individualism” (characteristic of much quantitative social science) with its assumption of sovereign individual choice and the Charybdis of Hegelian holism, which subsumes all particulars under the sign of the general, the system. Relations are crucial (although not the sole) determinants of meaning, as structuralism and other systematic paradigms insist. But relations are contingent, do not necessarily concatenate into ever larger systems of connections, and work upon entities that have substantial properties of their own (what Spinoza called conatus in human individuals).3 Things (persons) are qualified by the relations in which they stand to other things and/or persons, but “constituted” may be too strong a word if we allow it to suggest complete plasticity. There is resistant matter in things and persons; they are not infinitely malleable, as anyone trying to “socially construct” a two-year-old knows. Pluralism searches for a methodology that credits that resistant something without erecting it into the particular’s identity and/or essence. The method also has to register how things change, often dramatically, when placed in new relations, new contexts.

Pluralism, simply put, sees a world that is full of many different things, of many different contexts (or assemblages of things in relation to one another), and of a variety of vocabularies that humans use to position themselves among those things. I want, in the rest of this chapter, to untangle further characteristics of pluralism and to consider some of its consequences. Since I am not ready for a systematic account of pluralism (if such a self-contradictory undertaking is even desirable), what follows is more a set of illustrations meant to flesh out what pluralism claims and where it leads us. Needless to say, the illustrations are meant to be persuasive.

To begin, I want to suggest an interactional model of situations. The pragmatist model of action starts with an individual in a situation. The contrast is to what Dewey (1981, 26) calls “the spectator theory of knowledge,” which posits a knower distanced from the objects to be known.4 The pragmatist self is always already embedded in situations, always already within a society and a culture, always already located in a world that acts upon it and upon which it acts. Knowledge is a by-product of this immersion, not something constituted prior to it or separate from it. At first—and for the most part—the individual acts habitually, minimally conscious of her routine responses within an environment. Matters only get interesting when the routine fails to achieve its usual (expected) results.5 The individual is pulled up short. An element of doubt has been introduced by the recalcitrance of the world (world here encompasses other people, objects, and institutional arrangements and relations, as well as the agent’s own body). The agent must reconsider her habits and her options. She performs an “inquiry” (Peirce’s and [sometimes] Dewey’s word) or “reconstruction” (Dewey) that leads to a reassessment of the situation and thus an altered relation to it. What was habitual, routine, unconscious becomes considered, reflective, conscious. Crucial is the insistence that knowledge and action both entail the maintenance or alteration of the self’s relation to the situation, of the self’s way of being embedded in the world. Hans Joas (1996, 133) calls this pragmatic picture of a self responding to its surroundings “situated creativity,” the re-vision of possibilities and strategies in relation to the demands of novel situations.

This account, so far, is hardly at odds with rational choice models or other forms of methodological individualism, including some versions of liberal pluralism. But the pragmatist emphasis on habit does posit large domains of what might be deemed prerational behavior. We will do many things habitually in life—and that’s a blessing. When we think consciously about breathing, we usually muck it up. But habit, while always present and sometimes necessary, is never sufficient. And it is when pragmatism turns its attention to the formation of habits, to their insufficiency, and to the processes of their reconstruction that it departs significantly from individualistic models. For a start, habits themselves are not individually generated. To a certain extent, habits respond to the world’s regularities. Nature is lawlike because various configurations and events recur. Social arrangements also attain relative stability (relative because no social arrangement lasts forever and because the stability in question may only manifest itself in some circumstances, while not in others). Habits are mapped onto these stabilities and regularities. Routine action generates the expected results because the situation of today is not very novel in relation to yesterday’s situation. Actions that have gained the desired end will be repeated until they fail. The world is such that many repeated actions do not fail, so many actions become habitual. Habits, thus, are products of the relations to the world, to others, and to society in which the individual stands, not individually generated.

The pragmatist definition of “world” and “situation” is not limited to an individual facing a nonhuman environment. Because these terms also encompass others and social arrangements, the interactional model cannot be dual (self facing nature) or even triangular (self facing nature and other selves), but quadrilateral (self facing nature, others, and social arrangements). The environments within which we act are (not always, but much of the time) human-made as well as natural, and the results at which we aim include the desired response (approbation, love, obedience, cooperation, to name just a few) of others and (sometimes) the maintenance, reform, or contestation of the social arrangements that structure our relations to others and to nature.

In other words, pragmatism (especially in Dewey and George Herbert Mead) understands habits as socially produced, understands the relations in which agents stand to the other elements of a situation as socially mediated. Habits are not simply individualistic responses to the world; they are also socially instituted, reinforced, and transmitted. Many habits are acquired through a slow process of education. We reach here the place where habits become fuzzily related to norms or ideologies. The unconscious routines of individual agents are acquired through experiences that are not solely individual but are, at least to some extent, social. With Mead we get a pragmatism fully committed to the insistence that the individual herself, the self as the unit of action and organized experience, is socially constituted.

From methodological individualism to a socially constituted self, pragmatism may seem condemned to swing from the fears of fragmentation and anomie that characterize subjectivist interpretations of modernity to the visions of lock-step conformity and social engineering that mark the dystopian visions of Huxley, Orwell, Adorno, Foucault, and other critics of totalitarian, mass, or disciplinary society. Joas is absolutely right to identify “situated creativity” as the talisman that allows pragmatism to escape these unpalatable choices. The analysis of habit is crucial, because it avoids any presentist model of the individual encounter with the situation. The individual enters the situation with a set of habits, beliefs, predispositions, established relations to self and others. The individual is, in a word, experienced—and carries the way of being in the world that experience has forged. The individual is not the blank slate that individualist accounts like rational choice theory or extreme Nietzschean versions of the willful self posit at the momentous instant of action. The pragmatist self has a past—and is oriented toward a future. Action in the present is deeply informed by that past and that future.

Furthermore, the situation itself has a past; it, too, is not simply present. Peirce’s semiotics are indispensable to pragmatism’s portrait of the situated individual as the site of knowledge of and action in “the world.” The individual cannot act in, respond to, a situation unless that situation is named. In other words, action for the pragmatist is conditional on a judgment about what situation I find myself in. And judgment is not purely perceptual (and thus presentist: what I see, feel, and hear now), but also linguistic, conceptual, categorical (Peirce took the word pragmatism from Kant). My judgment processes the raw data and raw feels of the present through the lenses of available vocabularies. The generalizing, categorizing, classifying property of names is crucial. The novelty of situations, the newness of the present, is tempered by judgment.

Crucial to pragmatist pluralism is the denial of infallibility or the singularly proper to this process of judgment and its results. There is a tension between the novelty of what is here now and these generalizing categories carried in our language. Multiple ways of characterizing a situation are possible, each of which singles out a different way of “going on” from here, a different way of aligning our relationship to the other components of the situation. In his history of the Civil War, Shelby Foote exasperatedly tells us that one year to the day after Robert E. Lee scored his greatest victory of the war by turning Joe Hooker’s flank at Chancellorsville, Lee turned both of U. S. Grant’s flanks in almost the exact same place in the battle now called the Wilderness. Grant was routed worse than Hooker was, Foote insists, but simply failed to acknowledge he was defeated.6 Now I think it fair to say that Grant was obtuse. The wonder is that Grant’s obtuseness was just about his greatest virtue. Or we might say that his vices became virtues in this particular situation. A better man would have handled the situation worse. Where the fact of defeat was a reason for retreat, Grant would look right past the facts and make the situation tell a different story. Famously, the Union troops cheered when they turned right after extricating themselves from the battlefield—right to move further south, rather than left to cross the Rappahannock River and return north.

To invoke a useful term from David Wiggins’s work (1998, 124–32), the facts of the situation “underdetermine” judgment and the ways that individuals will respond.7 Grant’s reading of his situation, while not conventional, was possible. The facts do not rule out his chosen course of action. Grant’s obtuseness extended to other people as well; he could stomach the slaughter of his soldiers, while the battle of the Wilderness (like the earlier battle of Shiloh) resulted from his narrow focus on his own plans to the neglect of imagining what the other army might do. But that same obtuseness enabled his unconventional judgment of his situation in May 1864. Neither the facts nor Grant’s obtuseness was all-determining. There were limits to what he could achieve, but those limits did not reside in one or the other component of the situation. The limits only became apparent in the interplay of all the components as a judgment was acted on and its consequences unfolded. Even Grant had to acknowledge defeat at Cold Harbor.8 The world bends to will no more predictably than a two-year-old child does. Underdetermining facts are not irrelevant, but neither do they tell one and only one story. They can be read in different ways, and there are often many successful courses of action open in any situation.

The Peircean point is that we could not act at all if we did not take the judgmental step of assimilating (through an imaginative leap that processes similarities, analogies, and formal symmetries/asymmetries) this singular present to situations already experienced. The pragmatist emphasis on experiment, on trial and error, acknowledges the highly problematic nature of our judgments. We should always consider these judgments fallible. They are preliminary hypotheses, the first guides to action, but always tentative, always to be revised in light of action’s results. William James begins his book, A Pluralistic Universe, by pointing to this tension between the need to name things, to assimilate the singular under general categories, and the inevitable inadequacy of any one naming, since some aspects of the singular thing will not be highlighted. No responsive action exhausts the potential of a situation. There are always different things we could have done, different opportunities we could have seized. Still, we have a tendency to take our namings as adequate, to neglect pluralism in favor of definitive assertions, so James posits an endless tension between the singular that solicits plural ways of responding and the generalizations that aim to fix that fluttering thing. “Individuality outruns all classification, yet we insist on classifying every one we meet under some general head. As these heads usually suggest prejudicial associations to some hearer or other, the life of philosophy largely consists of resentments at the classing, and complaints of being misunderstood” (1987, 631). Austin (1975) points toward this same tension in his wry comment that “we must at all costs avoid over-simplification, which one might be tempted to call the occupational disease of philosophers if it were not their occupation” (38).

Peirce’s semiotic is so important to a pragmatist pluralism because it factors in the social mediation that informs all human encounters with the world without simply locking us into the prison-house of language. While past experience and preexisting (social and linguistic) categories are crucial to our forming judgments in the present, such judgments do not preclude our processing feedback from the real. The expected results of action can fail to occur; we can register the fact of that failure, and we can revise our judgments, beliefs, and habits accordingly. Pluralism, then, resides both in the situation being capable of different descriptions that lead to different responses and in the refusal to accord any component of the situation (facts, self, others, or social arrangements) full determinative power.

But even this model of an agent judging a situation in the vocabularies afforded by social categories is too simple. We must also recognize that situations come to us already named and that we judge them in relation to future goals. Here the pragmatist vision of temporality joins with its persistent interactionism and pluralism. As William James memorably puts it, “the trail of the human serpent is thus over everything” (1975, 37). A situation and the elements of which it is composed are not pure percepts because they come to us bearing the histories of their previous relations to humans. Things—and, more generally, the world in the fullest sense of that term—bear the traces of their previous encounters with agents. If one manifestation of culture is transmitted habit, another is this overlaying of history and meaning (carried within language and tradition and serving as assumed background knowledge) that accompanies things. Human action alters the world, and now our relation to that world is mediated through these prior alterations. Situated creativity, then, involves an individual agent, but with an individual and a situation that are both deeply embedded in cultural codings that carry the experiences of the past and motivational/normative orientations toward a desired future. Since the actions of the past and possible orientations toward the future are multiple, there is no single right judgment about a present situation. The present situation affords various possibilities, although not infinite ones. Plural judgments leading to different courses of action are to be expected. That is why creativity is both possible and prized. Successful action is most likely (although there is no guarantee; the world can be, and sometimes is, perverse) when a judgment attentively responds to the various elements in the situation. But few actions will work upon all those elements, while different purposes may be successfully pursued in the same situation. Both the complexity of present situations and the very different pasts that experienced selves carry into situations lead the pragmatist theorist of action to expect multiple judgments and actions in the present and to expect that more than one judgment or action will prove adequately responsive to present possibilities.

This pluralism of response and possibility is meant to counter models that court social or any other kind of determinism. Pragmatist interactionism is another plank—the most basic, ontological plank—in this refutation of determinism. Pragmatism identifies four elements (agent, other people, material things, social meanings and arrangements) in any situation and insists that none of these elements is determinant. No element has independent standing; each is an interactional product of the encounters among all four. The identification of the four elements is an ex post facto result of theoretical analysis that rather falsely suggests an independent existence for each one. The ontological claim is that the four only exist (for humans at least) in interaction with each other. The dynamic, ongoing, and inescapable intertwining of the four through time is the environment in which humans find themselves. The human organism thus embedded is continuously adapting to the circumstances of being in the world. Attempts to indicate the causal contribution of any of the four elements to the creation of the situation belie their mutual dependence, the fact that each can only ever act in conjunction with the others. We do not encounter or know any of these four elements in isolation or even in some nondynamic moment of inaction. The pushes and pulls of their coexistence are constant. The world just is the interaction of these four (this is the ontological claim)—and no one of the four is predominant; none gets to call the shots unilaterally. Adherence to this model of complex interactions suggests, in Feyerabend’s words, that “the dichotomy subjective/objective and the corresponding dichotomy between descriptions and constructions are much too naïve to guide our ideas about the nature and the implications of knowledge claims” (1999, 144).

From this pragmatist perspective, almost all theories of knowledge, judgment, and action are reductive, taking one or another of the four elements as determinative, and thus reducing all action to a reflection of individual temperament (subjectivist psychology), social and cultural coding (ideology theories), natural facts (realism), or the pressure of immediate others (social mores and sanctions). Each of the elements, the pragmatist insists, underdetermines what is now and what will unfold as the dynamic situation moves into the future. Underdetermination is one reason predictions of action are so unreliable. At the very best, some statistical regularities may be identifiable. But individual predictions are hit or miss. The variables are just too complex, since their limited number (four) is joined to the indeterminacy of just how much weight any one carries in any particular situation. Analysis after the fact can offer plausible accounts of how the variables interacted to produce a specific action and specific results. But even these analyses will occupy a realm of plausibility, not exactitude, and will be subject to the pluralism that stems from the different possible ways to name a situation and the different possible identifications of its consequences. So, for example, I think “obtuseness” captures something about Grant that provides a plausible account of his actions, but other interpretations, other judgments of his behavior, are certainly possible.

Situated creativity, then, calls on us to focus on the unexpected and novel ways that a particular person goes about responding to a particular situation. Of course, Peirce and Dewey were both interested in communal enterprises; it is not entirely clear how much this basic model of action would have to be modified to account for communal creativity. Physical possibilities, normative expectations (which carry a range of sanctions if violated), and institutional arrangements all structure the field in which action takes place. But the wild card of the agent remains just that; there are multiple ways to act within a situation—a fact that becomes especially relevant in fields where activity is most lauded when it is not (fully) routine or predictable. We value novelty and difference more in some fields than in others, but our bias toward individualism seldom leads us to praise slavish imitation and complete predictability.

William James famously shook off years of depression when he put the specter of determinism behind him with an act of sheer assertion.9 An arbitrary and gratuitous action, by virtue of its occurrence, disproved determinism. Appeals to the wild card of agency can look similarly ungrounded. Recuperation of the singularity of creativity within the generalizing vocabularies of theory is never elegant. To a large extent, the argument rests on empirical observation. Humans continually do unexpected, unpredicted things, and humans also demonstrably alter established routines in response to altered circumstances, altered goals, or altered interpretations. It seems odd to be called upon to prove that one situation differs from another just as one person differs from another. Yet the tendency to assimilate these differences within frameworks that group singularities according to similarities is so strong that pluralism is often on the defensive. But surely our theoretical inability to account for creativity and its plural effects says more about the limitations of our theories than it does about the actual capacities of human agents or the nature of the varied worlds they fashion in interaction with others, things, and cultural meanings.

The pragmatist understanding of situated creativity brings one final embarrassment in its wake: the assumption that agents are capable of monitoring the world and of reflexively processing the information received. In other words, a theory of creative action entails a (however minimal) bottom-line individualism. There must be a point, even in a fully interactionist theory, where the self cannot be reduced to a function of forces external to it (or even of forces “internalized” through some process of socialization). That point in pragmatism focuses on the self’s ability to read (to judge) situations. The pragmatist model cannot survive an “error theory,” that is, any account of behavior which places the self’s ability to know what it is doing into radical question. Pragmatism depends on the fundamental trustworthiness of consciousness (perhaps not immediately, but at some level of reflexive process). Any theory that posits unconscious processes as more constitutive of action than conscious choices cannot be compatible with the pragmatist outlook. If habit is not amendable in response to experience, pragmatism is a non-starter. The pragmatist must be hostile to theories of ideology that posit motivations and intentions unavailable to consciousness as the determinants of action. Pragmatism depends on agents who can, for the most part, know what they are doing. The pragmatist need not deny systemic relations and/or effects, just as he hardly ignores inherited social codings, but must deny that agents are systematically and incorrigibly unable to perceive and take into account these relations, effects, and codings. The strongest argument here is that the theorist of ideology has achieved a conscious understanding of these matters. What, in principle, could refute the possibility of all other agents’ attaining a similar understanding?

The notion of ideology highlights that there are social heuristics for grasping situations, pre-established maps for how to proceed when meeting situations of this or that type, along with guidelines for seeing that it is this or that type that we face now. Novel interpretations that fly in the face of these heuristics must overcome not only the inertia of habit but also the skepticism of others who are prone to follow convention. The extent to which received categories determine judgment is overstated by most theories of ideology, but that does not mean that the problematic of ideology is false.10 We process the real according to forms that are neither entirely self-generated nor easy to revise. And even when we manage to break through the crust of convention, we still have the difficult task of persuading others to accept our novel reading of the situation.

The argument against ideology theories is that selves in a culture do not all judge situations in the same way and that experiences of new circumstances can change the terms and categories we bring into situations. Change does happen; our “defaults” are transformed by living a life. Because situations are both complex and novel, there is nothing beyond responsiveness to the particulars of the situation and a knowledge of the semantics of available general terms to guide our namings. Judgment takes place in a setting that is chronically underdetermined, which is precisely what ideology theory, with its emphasis on overdetermination, denies.11 Realism, with its search for the “right” name, also denies underdetermination. Judgment is not an exact science; its inevitable reliance on analogy links it more to the poetic faculty as described in Aristotle’s Poetics than to any Adamic notion of a proper naming.12 Not surprisingly, we get disagreements over labeling all the time. Such disagreements are endemic to pluralistic societies in which selves are encouraged to take individualized viewpoints and in which there are various traditions, various cultural orientations, on which individuals draw.

What consequences follow when someone like Dorothea Brooke refuses “to call things by the names that others call them by”?13 In art since 1750, we expect and value idiosyncratic namings. We encourage defeating expectations and strive for surprises. Beyond the pleasure of novelty, theories of art since the Romantics have often claimed psychological, social, or moral benefits from the poet’s ability to find new names for things. Kenneth Burke, as I discussed in chapter 6, finds these novel namings magically transformative. A new name opens up entirely new possibilities; it is as if a charm has been undone. We suddenly see a way forward that we did not see a moment ago. Such conversion experiences, the sense that “I once was blind, but now I see,” capture the exhilaration that attends both acting in and witnessing the drama of creation. Pragmatism shares the Romantic admiration of Prometheus. Humans can re-word the universe, thus altering the received world to fabricate a new one better adapted to human needs and desires. The world admits of plural outcomes, and human ingenuity is called to direct the stories down the best possible paths. Pluralism goes hand-in-hand with a heady freedom and with viewing “creativity” as a god-like capacity that should be cultivated and given every opportunity to “express” itself. The plot of history has not been written. The underdeterminative facts can be like putty in our hands. Human desires and imagination are not futile; they can be realized in the here and now. Apparent constraints are more likely psychological (fear or some self-limitation of will and vision) or social (the conformism and lack of imagination of the herd, according to Nietzsche, or the blocking forces of coagulated power identified by leftists) than natural, inevitable, or “real” in some human-independent way.

Visions of such absolute freedom have proved terrifying as well as heady, and pragmatism outlines a “situated freedom” for reasons similar to the account of “situated creativity” offered above.14 For every Prometheus unbound there are five Fausts, characters who come to grief when they find themselves in a world without limits or constraints. Even Nietzsche has to posit “eternal recurrence” to structure what otherwise looks like the formless chaos of total freedom. Romanticism has proved more attractive as an idea than as a daily lived reality.15 Part of me, I must admit, regrets the continual compromises with total freedom, the careful stepping back from the brink of asserting and living the conviction that everything is possible and we need seek no one’s permission but our own. We only require the courage and creativity that such freedom calls for. The fault lies in us, not in the stars. We have proved incapable of drinking the cup of freedom to the lees. But perhaps others—the overman of whom Nietzsche dreams?—will succeed where we have failed. There can be humans who are Titans.

In other words, I sometimes suspect that our shackles are self-forged. The language of constraints—of choices dictated by the facts—sounds schoolmarmish to me, a tedious scolding, a droning insistence that those who refuse to restrain themselves will eventually be called to order. How dare you think that you can overstep the limits the rest of us respect? We’re watching you, taking comfort in our smug conviction that you will fail, and eager to take pleasure in your fall when it does occur. Such prim and petty reasonableness makes Nietzsche attractive. And when such reasonableness takes the form of political defeatism (the new bosses will never be any better than the current bosses, so utopian thinking is “unrealistic”), we should recognize it as a self-serving rationalization of the naysayer’s own privileges.

But the adjective “schoolmarmish” jumps out. The Romantic vision is linked to hyper-masculinist codes, as well as to aristocratic disdain for bourgeois mediocrity, with its investment in security, peace, and domestic well-being among loved others. One problem of total freedom is that so often its existence is proved by self-destruction or, much worse, the destruction of others. The abolition of limits, the enactment of full-bore creativity, gets played out through suffering, the infliction of pain on bodies.16 It is as if we don’t really believe we are free, so must do the most unthinkable things in order to prove it. But I don’t trust my intuitions here. I can only note the repeated pattern of extreme freedom’s connection to suffering inflicted on self and others; the logic of the pattern escapes me. I don’t see why or how Romantic freedom would inevitably bring suffering and variants of sadomasochism in its wake, but such is often the case.

So I am returned to the issue of constraints, the discourses of reasonableness. But I think the constraints are more self-imposed than necessitated by any facts. “Self-imposed” isn’t right either. “Humanly generated” might be closer. We are in the realm of the evil humans do to humans. Let me start with the liberal principle (from J. S. Mill) of not doing harm to others by my own actions. A limit of that kind on our freedom seems right to me. And so I am now in the position of saying that limits underwritten by morality are justified. How does this acceptance of limits fit with pluralism? I think, in fact, it can, but to get from here (the acceptance of a limit) to there (pluralism) will require a few steps.

The first step is the contention that morality is entirely human. Only someone who begins with certain values and convictions that are deemed moral could ever be in a position to judge this or that new situation as one that involves moral considerations. Nothing in the situation declares it a moral one; it can only be seen as moral through the lenses of an agent who has the category “morality” and has some content attached to that category. And while it may seem that morality is a “meta-category,” I think the same holds true for lower-level categories like “cruelty.” I don’t think dogs judge whether situations are moral or not, are instances of cruelty or not. I do not think “cruelty” is a natural kind; it is a socially generated concept, and individuals receive it as part of their initiation into a culture. That culture cannot fully control how the individual applies the concept once acquired; however, a being without the concept, or at least without the notion that events and actions can be evaluated along the lines of right and wrong, is not going to get to moral judgment just by looking at what transpires in full view. This point seems trivially true to me but is, of course, seen as pernicious and disastrous by those who want the reality of morality to be “mind-independent.” It doesn’t assuage such folks to add that my position does not lead to “subjectivism,” because it places individual acts of judgment within a field bounded by prevailing semantic conventions. The individual lives amid the others from whom he or she first learns moral categories, and this individual can no more successfully redefine robbery as morally indifferent than he or she can redefine “dog” as a large gray animal with a trunk and ivory tusks. This concession only moves “truth” from individual to communal processes of determination, and the philosopher committed to realism, objectivity, and mind-independence refuses to go there.17

In my view, only a humanly created morality can justify limits. We pass the buck, refuse to acknowledge the full extent of our Promethean powers, if we try to claim the limits are necessitated, imposed upon us, by “reality” or from some other transcendent, unalterable, superhuman location. Modern science—first physics and now biology—gives us such capacities to alter the real that the notion of “natural limits” has become just about meaningless. And the horrors of twentieth-century history have shown, in Berlin’s (1996) words, that “men of sufficient energy and ruthlessness could collect a sufficient degree of material power to transform their worlds much more radically than had been thought possible before…. Human beings and their institutions turned out to be much more malleable, far less resistant, the laws turned out to be far more elastic, than the earlier doctrinaires had taught us to believe” (9–10). Any limits are going to have to be humanly generated. It is an oddity of our scientific progress that if there are “real” limits to be found, they can no longer be plausibly located in a nature outside of us, but only in “human nature” as figured in intractable psychological and cultural dispositions. But, of course, the new genetic engineering promises to address personality traits the same way it addresses bodily diseases. Our paradoxical situation today is to use human freedom to limit human freedom. We cannot expect some non-human force to counter the conclusion that “everything is permitted.” If some things are to be forbidden, we will have to do the forbidding ourselves—and make it stick. To wait for a deus ex machina is only to ensure that everything will go forward.

Am I pulling back from full-bore pluralism? Yes, to the extent that I, too, will say with the philosophers that “not anything goes.” But I am less confident that something about reality or reason or our inbuilt cognitive capacities keeps anything from going. One lesson of the twentieth century seems to be that humans are capable of doing all sorts of unthinkable acts, that nothing stops them from astounding creativity in imagining and performing actions that call forth the word “evil.” My position is that the fact of evil compels every society at some points and some places to forbid certain human actions. Both evil and the attempt to constrain it are completely human. Again, it is hard to think of animals as evil or to imagine groups of animals devising strategies to counteract evil. Setting limits on human action and enforcing those limits is, in the original sense of the word, awful. We have this terrifying responsibility. We cannot shirk it. And we should strive to retain a sense of its awfulness. The use of social power to constrain individuals should only be countenanced by a conviction that it is necessary as a last resort. We should be ever skeptical of rules and the sanctions attached to them, returning again and again to question the necessity of even the most time-honored examples. Complacency about social power is dangerous because such power almost always overshoots its mark, ends up constraining and punishing more individuals than is necessary. The problem, of course, is that determining what is necessary is a matter of judgment and, hence, of possible different conclusions. As Wiggins (1998, 314–22) puts it, the meanings of the moral concepts by which we assess need, establish limits, and judge appropriate applications are “essentially contestable.”

This contestability, combined with the awfulness of humanly constructed power used to limit human freedom, makes pluralism in moral matters so important. I believe that we want to encourage disagreement, multiple interpretations and judgments, because we need to combat at every turn the possible ossification of moral precepts and their enforcement. Constant disputes, prompting constant re-examination of even the most basic principles, work against a complacency that loses sight of how awful it is to constrain and, worse, to punish another human being. If untrammeled freedom is linked in some mysterious way to sadomasochism, the link between moralism and a pleasure in others’ suffering is all too unmysterious. That’s why the talk of constraints in much writing on morality is so often insufferable. The pleasure taken in reining others in is all too palpable. To put it differently: complacent and dogmatic conformism is more prevalent, I think, than dangerous amoralism. Fear of Yeats’s “blood-dimmed tide” of anarchy justified massive state-organized violence throughout the twentieth century—and the willingness of many citizens to go along with and participate in that violence. Reverence toward (or at least, sullen compliance with) received authority is much more common than total and unprincipled defiance.

Pluralism in moral matters, then, takes its stand with disagreements as salutary. They should be encouraged. Consensus in moral matters banishes an uneasiness I think we would be better off never losing. Beware the man convinced of his own righteousness. He will do harm to others with a clean conscience and firm, single-minded purpose.

Luckily, I think the prospects for quelling disagreement are dismal, although that doesn’t stop many from trying. Take the example of a trial for murder. There are the facts of the matter. Did this person kill that person? Sometimes the facts are in dispute. But there is also another, entirely different, kind of question. Does this killing count as an instance of murder? Maybe it was manslaughter, self-defense, or an accident. The facts may be relevant to this second question, but they are under-determinative. Precedent, interpretation of intent and motive, and the understanding of what the general terms available mean will also be relevant. And, if we abandon the courtroom for a moment, we can recognize that morality often faces a third—and still different—question, namely “Was this action wrong?”

My position is that everything from facts to murkier matters of definition, assessment, and norms is a potential subject of disagreement. There are no knock-down arguments about anything that are guaranteed to convince everyone. I do think it is useful in many ways to be clear about different categories of statements, and I think the way you get such clarity is by recognizing what serves as your best evidence in cases of disagreement. I say to my son, “I didn’t know Mary’s hair was red.” He answers, “It’s not. Just look at her.” That’s where the spade turns: if my looking doesn’t do the trick, my son has no place else to go, no other evidence to bring forward. (I know of what I speak here, since my wife and I disagreed for years over whether a suitcase we owned was black or blue. We made no progress in this dispute, but also demoted “being right” to a place of minor importance. I will admit, however, that when its zipper broke and I threw it out, the resultant relief made me wonder why I hadn’t adopted that solution earlier.) But if I say, “I didn’t know Mary was Helen’s sister-in-law,” my son can’t respond, “She’s not. Just look at her.” Something else will count as our best evidence for that claim. In this case, he might say, “She’s not. Just ask her.” And even if I am unwilling to accept Mary’s self-report on the matter, we have made some progress toward understanding the terms of our disagreement. In other words, we can identify what serves as justification for a claim to “being right.” But there is always the possibility that someone will deny the cogency of that justification.

When we get to complex covering terms like “murder,” “a virtuous life,” or “justice,” appeals to the facts, to looking at what is there, can never do all the work, never exhaust our reasons for making the statement that this particular situation is a case of injustice. It is useful, when disagreements arise, to be as clear as possible about the reasons we do have, because, pluralism insists, we have many different kinds of reasons, many of which are irreducible to statements of fact. To label an action “murder” is to set into motion a whole series of responding actions—arresting the person, searching for evidence, considering appropriate punishment, etc. Those who argue that the label “murder” was inappropriate in this case are advocating different responsive actions. And, as in the case of Grant in the Wilderness, the facts do not determine fully in and of themselves which set of actions should be undertaken.

We reach here the perils and pleasures of pluralism. My argument is that, while agreement is always possible, disagreement is also always possible. And not only is agreement contingent, but it is also contingent whether agreement is desirable. In many cases, we cherish disagreement over agreement. It does seem easier to reach agreement about some matters than about others, but we lose much that is distinctive and valuable about morality if we try to curb its notorious proclivity for endless disputes by making it more like matters that seem to generate less disagreement. Such a strategy, I am arguing, not only underestimates the potential and actuality of disagreement in these supposedly less contentious matters, but also runs roughshod over the plurality of different kinds of arguments and evidence used to back up claims.

There is no end to dispute and disagreement. But, surely, that is an unhappy conclusion, pointing toward a world of strife and conflict. It seems impossible for there to be any successful living with others if there is constant and continual disagreement. We have to agree on some things to coexist. I think this is true. We put tremendous effort into teaching received commonalities to our children and, crucially, into getting them to agree that those commonalities are “right.” The effort reflects our awareness (on some level) that agreement is contingent and that no community can survive for very long without voluntary compliance with some set of ground rules. Sheer coercion won’t work. The peril of pluralism is that we won’t get voluntary compliance. My argument is that voluntary compliance stems from a variety of considerations: from ties of affection to other members of the community; from fears of others’ disapproval; from appreciation of the benefits of peaceful co-existence; and from a sense of the “rightness” or “justice” of certain precepts. But voluntary compliance is just that: voluntary. Nothing in the nature of the facts, or the reasons, or the consequences guarantees compliance. There will always be people who don’t comply. And every society has to face the question of how to respond to non-compliance. What dissenting opinions and actions will it tolerate, and which will it step in forcefully to restrain? My pluralism suggests that we be wary of designating some behavior—and even more, opinions—intolerable, and that we try to keep all designations open to re-examination and re-formulation. But I don’t think universal tolerance is possible.

Voluntary compliance is so important to us not only because it’s easier, more efficient, and less violent, but also because within our tradition we value autonomy and autonomy’s off-shoots: distinctiveness, originality, creativity, and innovation. That our individual choices and lives are unscripted, that the facts, other people, and tradition do not completely dictate our responses, is something many of us value. The only thing worse than a world in which no one agreed with me about anything would be a world in which everyone agreed with me about everything. We reach something of a paradox here. I am trying to convince you of my pluralistic vision. Yet total success would be the most dismal failure. (Of course, the prospects of total success are mighty slim; that fact is one of my core reasons for being a pluralist.) Do we really want a world in which moral and other issues are not always open to disagreement and dispute? Be careful what you wish for. Are we so confident in our current formulations that we would not value the person who comes along to challenge them? More likely than not, that person is a pain in the ass, a troublemaker, a gadfly. But grudgingly, sometimes only years after the fact, we honor such people.

Pluralism is both frightening and exhilarating. Disagreement and especially disapproval terrify us, yet complete unanimity would be deadly. The pleasures of pluralism, I want to suggest, are an acquired taste. Cultivation of that taste seems to me a significant part of moral education. Cultivation of such a taste is crucial in a democratic polity.

How does pragmatist pluralism’s refusal of fixed, determinant realities connect to the views of intellectual activity and cultural politics offered in this book? Let me approach this question through another moral consideration: what does morality cover? What counts as a morally relevant situation? I assume that most of us accept that various actions are morally indifferent. Whether I eat potato or tomato soup tonight is not a moral matter. But no sooner do I say that than I begin to imagine circumstances under which such a choice might seem morally significant.

This ability to transform the situation from one that is morally indifferent to one that is morally fraught suggests that moral vocabularies are not entirely stable. Attempts to transform what labels we apply to new or familiar situations are rampant. Success of such efforts depends on what we can call, following Austin (1975), “uptake.” In other words, one of the effects of pluralism is that people are trying to convince other people of all kinds of things all the time. Not only does human action transform the world, but human interaction transforms selves. All is in motion, all is changed through these dynamic relationships. And so it occurs to me that a significant component of morality is convincing others that some situation is morally relevant. For example, in the novel Crossing the River, Caryl Phillips (1993) portrays a slave-trader through his laconic logbook—“bought a strong young man and a small girl today; refused 2 others as too sickly” (103)—and through his love letters to his wife. The effect of this juxtaposition is to suggest that the moral repugnance or probity of slave-trading never occurs to the trader. It’s just business. It takes a rhetorical effort, a discursive shift, to see slavery under the sign of morality. (That discursive shift begins with the efforts of William Wilberforce in the 1780s to ban the slave trade and continues through the abolitionist movements of the 1800s.)

We might say the same of eating beef. To the vast majority right now, eating beef is not a moral issue. There are people who are striving to make it morally relevant. Whether they succeed or not is underdetermined by the facts of how cattle are raised and killed, and of human needs for protein, although such facts are relevant. It seems to me a matter of some interest to moral theory to consider how transformations in our understanding of the morally relevant occur. And I will admit that I hardly know how to begin to develop such an account. I’ll just remind you that the transformation moves in both directions. Homosexual acts were morally indifferent in many ancient societies, became morally significant in much of the modern West, and now many are striving to make them morally indifferent again. The nature of such acts in and of themselves cannot alone decide the case.

Let me try to be very clear here. I do not see how something in the physical nature of homosexual acts can make it “right” to declare such acts moral or immoral. In every case of judgment, I am all in favor of being as explicit and articulate as possible about the reasons I have for making the judgment I propose. But where others read the case differently, I don’t see what it adds to say “but I am right.” And, in fact, where disagreements occur, I think that saying “I am right” is a closing move. It brings discussion to an end; it marks the point where I will no longer entertain your reasons for reaching a different assessment. We do reach the end of discussion with others; but since we have to live with those others, I am suggesting that we want to be wary of such endings. I want us to try, as long as possible, to accept that the other has as good reasons for his or her beliefs as I have for mine. It is a drastic step to conclude that I am right or reasonable or moral, whereas the other is not.

Pluralism, then, makes cultural politics—the attempt to alter the vocabularies in which we understand our experiences and our world—central. But pragmatist pluralism’s affinity with “direct realism,” its attention to the recalcitrance of things and people to total determination by the cultural terms through which they are viewed, underwrites the insistence that the centrality of cultural politics should not blind us to the limits of what it can accomplish. Nothing can tell us ahead of time where and how recalcitrance will manifest itself. Our actions and our speech acts aim to alter the world and the relation in which we stand to it, but it should come as no surprise that our efforts are not always successful. When it comes to altering others and their relation to us, we occupy a primarily rhetorical scene, although we have other ways besides persuasive words to refashion others more to our liking.

The ominous tinge of this last phrase points toward the tightrope I have been walking throughout this book. I want speech acts and actions that aim to change the world, others, and myself, yet also to cultivate an appreciation, even a celebration, of the ways the world, others, and even myself, resist my best efforts. Pluralism champions resistance, the extent to which things continue to be their singular selves despite my designs and work upon them. Thus, pluralism suggests that intellectuals will find their work in the rhetorical effort to get people to change the names that they apply to situations. But it also suggests, in ways not fully compatible with that first task, that intellectuals, like teachers, will also direct their rhetorical efforts toward encouraging others to develop their own capacities as judges and to adopt a reflexive attitude toward their judgments after their production. Insofar as intellectuals can embrace this second task and cherish the rather chaotic and messy diversity of orientations and values that follow from it, they are aiding the cause of democracy. Or so I have been arguing.

Readers have complained that this formulation is too abstract, too formal. Doesn’t pluralism entail any substantive commitments, any concrete courses of action? The ways I have been using democracy can seem either bloodlessly procedural or vacuously hortatory (as so often in Dewey and even more often in Walt Whitman). I have tried to suggest in chapter 2 some of the ways in which a classroom can model a democratic public sphere and in chapter 4 have considered the need for state action to preserve and foster such public spaces. One of the great frustrations of current attempts to reform voting procedures, to reduce the influence of wealthy contributors in our political process, and to combat the concentration of the media in a very few corporate hands is that a “general welfare” interest in democracy per se is not a recognized legal ground in constitutional law (at least as currently interpreted by the prevailing majority on the Supreme Court). We cannot get the kinds of institutional structures that promote democratic interactions of the type I have been advocating if we have to argue on the basis of “individual rights.” The use of the First Amendment right to free speech to stymie campaign finance reform is only the most dramatic case in point. That a practice is not democratic (a contestable point in each case, to be sure) is no argument against it. So advocates of democracy have their work cut out for them: very fundamental transformations of the United States’ political institutions are called for if democracy is to flourish. Aiming for such transformations goes hand in glove with, but is recognizably a different enterprise than, aiming to transform American political culture (broadly construed). The available venues for public deliberation, the quality of the interactions in those venues, and the limits on who is able to participate in those interactions all leave much to be desired, much to be reformed.

However, as Eve Sedgwick (1990) points out, an “emphasis on the performative relations of… conflicted definition” (of a term like democracy as much as of the terms—homosexual, gay, queer—highlighted in her work) suggests “a practical politics” of “multi-pronged movement… without any high premium placed on ideological rationalization” among the various actions taken. “The cost in ideological rigor, though high indeed, is very simply inevitable,” she insists. “[T]his is not a conceptual landscape in which ideological rigor across levels, across constituencies is at all possible, be it ever so desirable” (13). In short, even where political agents can mobilize groups and win battles through invocation of the term “democracy,” we should not expect immediate or even eventual consonance with other uses of that term in political struggles. The accumulated weight and legitimacy of the term “democracy” make it worth invoking by all sides in many contests. Pluralism leads us to expect many contests and many invocations. Resolutions will ideally result from felicitous performances that secure “uptake,” will pragmatically result from decision procedures (like voting) that bring acceptable closure in the absence of consensus, and will all too often result from the more powerful contestants taking matters into their own hands. All resolutions, however achieved, will be temporary. And we will need norms of democratic procedures and ideals of full participation to challenge the premature and unequal closures wrought by those with power. The only response to a resolution one abhors as unjust, illegitimate, or “wrong” is to contest it, with the choice of the means for such contestation a fateful one. Nothing external to the contest will save us from it—or secure the issue of it. But that does not mean we should underestimate the efficacy of principles, ideals, and norms as resources in the contest.

In the universities where intellectuals mainly reside, pluralism suggests that individualistic models of scholarly work (especially prevalent in the humanities) are misguided. Interdisciplinarity should not mean one individual mastering several discourses of inquiry, but teams of scholars working together on broadly defined topic areas from different perspectives. Collaborative work should not be aimed at overcoming the deficiencies of each individual contributor (although it can and will have that effect in some cases), but at recognizing the plurality of ways that a topic can be approached and understood. We should not expect some holistic synthesis to emerge from such collaborations (although we needn’t reject such syntheses if they occur), since the revelation of differences in results and the beliefs they engender can be as illuminating as convergence. Current modes of working foster not only ignorance of others’ work, but a defensive contempt for approaches that differ from one’s own. Doubtless, building this kind of intellectual community on campuses will reduce productivity as measured by numbers of articles and books published. Creating functioning public spheres on campuses would place local amenities and interactions on a higher level, vis-à-vis the more abstracted interaction with the scholar’s national professional cohort, than has been the general rule over the past forty years (at least). A sea change in academic culture, in the priorities and interests of the academics themselves, would be required. But that change can look impossible to accomplish if we see the chore as transforming the culture tout court and in one fell swoop. Rather, in the pragmatist experimental mode, we should be creating local working groups, trying out ways of doing our work differently and more collaboratively, seeing if we can address and reach different audiences than the ones we have habitually written for. From the actual doing will follow the changes in attitudes, purposes, and goals.

I will mention one such experiment in Chapel Hill, sponsored by UNC’s Institute for the Arts and Humanities.18 Faculty members are paired with a member of the community who is working on a project in the arts or in community organizing. The program gives the community member access to the university’s resources and to the advice and feedback of the faculty member. It also provides the faculty member with an “in” to the world beyond the campus walls and with the challenge of converting his or her specialized knowledge into something of use for the nonspecialist. In addition, the whole group of ten pairs meets three times a year to discuss what each pair is doing—and various participants have found these cross-pollinating gatherings the most valuable part of the whole experience. In short, Dewey’s assessment in The Public and Its Problems of democracy’s need for vital public spaces remains as true now as it was in 1927 when he wrote it. In our classrooms, but also beyond them on our campuses, academic intellectuals have more opportunities to create such spaces than just about anyone else in American society.

The temptation is to offer a final summary that pulls the various points I have made together under the covering term of pluralism. However, not only would such a conclusion test your patience and insult your intelligence, but it would also violate the spirit of pluralism, which finds the world a messy and complex place, in which all things do not hang together. We have competing demands upon us, must choose among conflicting goods, and creatively chart a course for ourselves, knowing that, for better and for worse, not everyone will approve of our decisions, judgments, and actions; and that the facts of the matter will not make one course of action obviously better than another even as they do limit what is possible. Good luck.


1. Rescher (1995) provides a very useful overview of the issues involved in taking a pluralist position, although I ultimately disagree with his insistence on a single world and single truth of which there are multiple versions. McLennan’s (1995) introductory volume is also superb; it is directed more toward issues in the social sciences, while Rescher attends to more purely philosophical debates.

2. McLennan (1995, chap. 1) offers a good, quick overview of liberal pluralism and considers the extent to which the various postmodern pluralisms (which usually vehemently deny any kinship with liberal pluralism) retain certain liberal themes of the 1950s. McGowan (1991) argues that poststructuralism often resembles the liberalism it claims to abhor, most particularly in remaining attached to exactly the kind of negative liberty that troubles me in Berlin’s work. Berlin certainly embodies the kind of “diffident liberalism” that I explore in chapter 4 of this book.

3. Spinoza’s concept, conatus, does some of the work I am trying to gesture toward here. Conatus is “a thing’s endeavour to persist in being,” the pressure it exerts back outwards toward the world (Lloyd 1996,9). For Gilles Deleuze, conatus becomes connected to what he calls Spinoza’s “expressionism”: “Our conatus is thus always identical with our power of acting itself. The variations of conatus as it is determined by this or that affection are the dynamic variations of our power of action” (1990, 231). In this interpretation, conatus names that primal something out of which we act toward the world. Unless we posit some such energy or power in the self, we risk seeing the self as an utterly passive recipient of the world’s imprint. Charles Altieri (1994) also relies on conatus to name “the force driving our investments” (24) and connects it to the concept of “style,” which marks each self’s distinctive ways of manifesting that “force” (see 85-87). Aaron Pollack and Jörg Schaub, along with Altieri, share the blame for my invocation of Spinoza.

4. Arguing against philosophy’s obsession with “knowledge” and in favor of a focus on “experience,” Dewey writes: “[E]xperience is not identical with brain action; it is the entire organic agent-patient in all its interaction with the environment, natural and social. The brain is primarily an organ of a certain kind of behavior, not of knowing the world. And to repeat what has already been said, experiencing is just certain modes of interaction, or correlation, of natural objects among which the organism happens, so to say, to be one. It follows with equal force that experience means primarily not knowledge, but ways of doing and suffering. Knowing must be described by discovering what particular mode—qualitatively unique—of doing and suffering it is. As it is [i.e., in the philosophical tradition Dewey is trying to overcome], we find experience assimilated to a non-empirical concept of knowledge, derived from an antecedent notion of a spectator outside of the world” (1981, 26).

5. This is hardly the place to take up the philosophical debate between realists and antirealists—a debate that has, I think, hardened into a ritual so prescripted that it has long ceased being productive. But I will note that I am with Hilary Putnam in believing that pragmatist pluralism is compatible with a “direct realism” that credits the commonsense experience of living in a world of material things, persons, social institutions, bodily sensations, and any number of other entities encountered in our daily rounds. One key is that these experiences are unproblematic until something causes us to “doubt” our usual ways of responding to all that surrounds us. As Charles Peirce (1992) insisted, much “philosophical doubt” is singularly unreal; it does not arise out of thwarted interactions with situations and their components. And where “doubt” does occur, action (or “inquiry,” but inquiry always understood as action upon the world) follows. Such action aims to readjust our relation to circumstances, so our situation is improved. Another key is that the solidity of these material things is only one relevant consideration among others that influence our judgments and actions. And, finally, as Putnam puts it, comes “the denial that reality dictates one unique description” (1998, 45). Or, as Feyerabend puts it: “The material humans… face must be approached in the right ways. It offers resistance; some constructions … find no point of attack in it and simply collapse. On the other hand, this material is more pliable than is commonly assumed” (1999,145). The reader wanting to begin to explore this pragmatist realism should see Putnam (1998), Peirce (1992), Dewey (1981), and Feyerabend (1999,131-60).

6. Foote (1974) writes: “‘Most of us thought it was another Chancellorsville,’ a Massachusetts infantryman would remember, while a Pennsylvania cavalryman recorded that his comrades used a homelier term to describe the predicted movement. They called it ‘another skedaddle.’
“If the Chancellorsville parallel was obvious—both battles had been waged in the same thicket, so to speak, between the same two armies, at the same point of year, and against the same Confederate commander—it was also… disturbingly apt. By every tactical standard, although the earlier contest was often held up as a model of Federal ineptitude, the second was even worse-fought than the first. Hooker had only one flank turned: Grant had both. … In plain fact, up to the point of obliging Grant to throw in the sponge and pull back across the river, Lee had never beaten an adversary so soundly as he had beaten this one in the course of the past two days.
“What it all boiled down to was that Grant was whipped, and soundly whipped, if he would only admit it by retreating: which in turn is only a way of saying that he had not been whipped at all. ‘Whatever happens, there will be no turning back,’ he had said, and he would hold to that” (1974,188-89, my emphasis).

7. I am probably using Wiggins’s concept in ways he would deplore. But I highly recommend the work of this moral and political pluralist to literary critics, whose antipathy to Anglo-American philosophy usually means they have never heard of Wiggins, much less read his important and consistently enlightening work.

8. In Foote’s (1974,291-96) account, Grant does not acknowledge defeat at Cold Harbor early enough and thus loses the confidence of his troops, who consistently refuse to obey orders to attack entrenched enemy troops for the rest of 1864. So Grant’s failure at Cold Harbor is not solely, or even primarily, reading the facts “wrong,” but not getting his army to ratify his interpretation of events. In Austin’s terms (1975), Grant’s speech act is “infelicitous,” because it does not have sufficient “force” to garner his audience’s agreement or “uptake.”

9. See Menand (1998) for a thorough and fascinating—albeit skeptical—account of how James escaped depression and how that escape figures in standard versions of James’s life and work.

10. See Ricoeur (1986) and Eagleton (1991) for two useful overviews of ideology theories. Althusser’s (1971) highly influential account of ideology is also an “error theory,” since it claims that “individuals who live in ideology” inhabit “a determinate (religious, ethical, etc.) representation of the world whose imaginary distortion depends on their imaginary relation to the conditions of existence, in other words, in the last instance, to the relations of production and to class relations (ideology = an imaginary relation to real relations)” (166-67).

11. Critics who complain that pluralism is naïve and over-optimistic usually insist that there is an underlying set of enforced, systematic relations that belie the plurality of options that pluralism indicates. Guillory’s (1993, chap. 5) critique of Smith (1988) takes exactly this position, while for Eagleton (2000) pluralism is the ideology of capitalism since it celebrates a diversity that is properly understood as the product of a capitalism that ruthlessly divides to conquer and segregates economic winners from losers. “The predatory actions of capitalism breed, by way of defensive reaction, a multitude of closed cultures, which the pluralist ideology of capitalism can then celebrate as a rich diversity of life-forms” (129-30). As Gibson-Graham (1996) argue, granting capitalism such monolithic identity (everywhere the same) and such omnipotence is hardly plausible given the variety of economic forms in the world that result from the interactions between economic and other (social, cultural, religious, and natural) factors.

12. Judgment is a kind of metaphor. Aristotle (1996) defines metaphor as “the application of a noun which properly applies to something else” (34) and tells us that “the most important thing [for the poet] to be good at is using metaphor. This is the one thing that cannot be learned from someone else, and is a sign of natural talent; for the successful use of metaphor is a matter of perceiving similarities” (37). Without attaching too much weight to “proper,” we can say that judgment uses a name that was applied in the past and now transfers its application to this situation, thing, event, emotion, etc. in the present. Acceptance that we must use the available stock of words to describe the novel present can be contrasted to an Adamic or Orphic notion of names that capture the essential truth of the thing named. See Aarsleff (1982) and Bruns (1974) for discussions of the persistent dream of a language that would speak the world as it is in itself as opposed to a language that uses human terms for nonhuman realities.

13. In Book VI, chap. liv of George Eliot’s Middlemarch (1997; originally 1872), Mrs. Cadwallader tells Dorothea Brooke, “We all have to exert ourselves to keep a little sane, and call things by the same name as other people call them by.” To which Dorothea retorts, “I have never called anything by the same name that all the people about me did.”

14. McGowan (1991) includes an extended critique of Nietzschean models of freedom and argues that individual actions are only meaningful within a context of relations to world and others. I still hold that position, even as I consider the appeal of the Promethean here.

15. Of course, Romanticism as a lived reality was always a minority avocation. What the majority seems to like is the voyeuristic thrill of watching Romantics like Byron or Wilde trace out the pattern of forbidden pleasures leading to dramatic falls. This same relation holds in the public’s current fascination with the fabulous wealth, beauty, and self-indulgence of celebrities joined to that same public’s satisfaction with the failed marriages, spectacular bankruptcies, and various drug addictions of those same celebrities. For a wonderful account of how the Romantic ideal lives on among rock musicians, see Marcus (1989).

16. Scarry (1985) is the fullest attempt to trace out this connection between creativity and inflicting bodily pain.

17. One such philosopher is Wiggins, who goes down three-quarters of the road toward a social understanding of truth, but pulls up at the last mile to insist that “objectivity is not mere intersubjectivity …. Agreement [among members of a speech community] plays its role in fixing senses. We only have a chance of getting to the point where a predicate has a clear public sense if the users of the language are so constituted as to be able to come to agree sufficiently over a sufficiently large area whether the predicate applies or not; and what senses we invest our language with plays its part in fixing what truths we shall be able to give expression to. But that exhausts the role of agreement—just as the size and mesh of a fisherman’s net determines what fish he will catch, if he catches any; not what fish are in the sea” (1998, 249-50). But “cruelty” is not the same as “fish.” The extension of “cruelty” is not always self-evident. Even where you and I agree that this man murdered that other man, we can disagree whether his execution by the state is “cruel,” while the question of the “cruelty” of capital punishment may never even arise in whole societies. What could be read off the “facts” of the case that would provide determinate criteria for judging it “cruel”? Here the prior existence of the concept (as shaped through communal speech practices) does seem required for even the apprehension of its possibly applying to this event. But Wiggins is right to remind us that “cruelty” is used to highlight discernible features of the action.

18. Ruel Tyson, the Institute’s director, invented this “Public Fellows” program, which I currently administer in my position as the Institute’s Associate Director. Thanks must also go to our donors, Robert Hackney and Shauna Holiman, whose generosity makes it possible.
