6 Agential Systems, Causal Deviance, and Reliability
Jesús H. Aguilar

According to the causal theory of action (CTA), an action is an event caused by a mental state or event that rationalizes its execution. Actions are typically exemplified by bodily movements, and their internal causes by intentions. The CTA has traditionally been saddled with problems emerging from so-called deviant causal chains, namely, chains of events that satisfy the CTA's conditions for the production of an action but whose product is intuitively not an action. A plausible strategy for defending this theory against the possibility of deviant causal chains is grounded in the proposal that the bodily movement corresponding to an action must be sensitive to the content of the mental state that causes it.1 Let us call this the sensitivity condition. Sensitivity here is understood as the specific responsiveness that a bodily movement can have to the particular content of a motivating mental state.

However, this strategy faces a serious challenge from cases where bodily movements are produced by means of causal chains of events that involve intermediate actions performed by another agent. These cases are quite challenging because they apparently generate deviant causal chains that satisfy the sensitivity condition. Consider Christopher Peacocke's (1979b) example of a neurophysiologist who reads an action-triggering intention directly from a subject's brain and then stimulates the corresponding efferent nerves. The stimulation in turn causes a bodily movement that matches the subject's intention. According to Peacocke, although such a bodily movement is transitively caused by an intention that rationalizes its execution, and although it is sensitive to the specific content of this intention, it is not one of the subject's actions. The reason it fails to count as an action is that its production contravenes a fundamental assumption concerning the nature of action and agency, namely, that the agent must be the originator of her own actions.2 Given the strange causal path involved in this type of scenario, such cases are usually considered deviant and presented as counterexamples to the CTA.

If indeed such scenarios involving what we may call "prosthetic agents" generate causal deviance, we are facing a serious challenge to the CTA's best effort to avoid causal deviance based on the appeal to sensitivity. Oddly enough, if one believes that Peacocke's neurophysiologist scenario is deviant, then one also needs to explain in what sense this particular case differs from other scenarios involving prosthetic agents that share all of its salient features without themselves being obviously problematic. Consider a similar fictional scenario where an assistant's job is simply to hold some wires that connect the efferent nerves of a subject, allowing the production of the subject's bodily movements.3 This case of prosthetic agency is analogous to the neurophysiologist's, since in both cases there is an intermediate agent who intervenes in the causal chain that ends with the subject's bodily movement. And yet, in contrast to the neurophysiologist's case, it is hard to say that the subject's ensuing bodily movement is not an action performed by the subject.
If anything, the causal role of the assistant is very similar to the causal role played by the subject's own efferent nerves, namely, that of a causal bridge connecting the subject's intentions to the subject's bodily movements.

Peacocke's answer to the challenge arising from deviant cases involving prosthetic agents consists in requiring that intentional behavior explicitly exclude the possibility of two agents interacting in this prosthetic way. He believes that by stipulating that a causal chain leading to intentional behavior "should not run through the intentions of another person" (Peacocke 1979b, 88), one can safely rely on the sensitivity condition to take care of deviance. If by definition no cases involving two agents interacting in this way are possible, then the puzzling resemblance between the cases of the neurophysiologist and the assistant is not an issue. But this is hardly a satisfactory reply if one is a supporter of the CTA. Not only is it an arbitrary exclusion of cases, like the assistant's, that are clearly conceivable, but it is also in tension with the CTA's general approach to action. In particular, Peacocke's stipulation is in tension with the CTA's acceptance of transitive causal relations involving actions as much as wires and nerves.4 Moreover, if there is a promising line...
