
This article is about an experiment I conducted for publication in a volume collecting the papers read at the Sixteenth Annual Alabama Symposium on English and American Literature: “Literacy Online: The Promise (and Peril) of Reading and Writing with Computers,” October 26–28, 1989 (organized by Myron Tuman). My talk at the conference placed current developments in artificial intelligence and hypermedia programs in the context of the concept of the “apparatus,” used in cinema studies to mount a critique of cinema as an institution, as a social “machine” that is as much ideological as it is technological. The same drive toward realism that led in cinema to the “invisible style” of Hollywood narrative films, and to the occultation of the production process in favor of a consumption of the product as if it were “natural,” is at work again in computing. Articles published in computer magazines declare that “the ultimate goal of computer technology is to make the computer disappear, that the technology should be so transparent, so invisible to the user, that for practical purposes the computer does not exist. In its perfect form, the computer and its application stand outside data content so that the user may be completely absorbed in the subject matter—it allows a person to interact with the computer just as if the computer were itself human” (MacUser, March 1989). It was clear that the efforts of critique to expose the oppressive effects of “the suture” in cinema (the effect binding the spectator to the illusion of a complete reality) had made no impression on the computer industry, whose professionals (including many academics) are in the process of designing “seamless” information environments for hypermedia applications. The “twin peaks” of American ideology—realism and individualism—are built into the computing machine (the computer as institution).

The very concept of the “apparatus” indicates that ideology is a necessary, irreducible component of any “machine.” Left critique and cognitive science agree on this point, as may be seen in Jeremy Campbell’s summary of the current state of research in artificial intelligence: “A curious feature of a mind that uses Baker Street [Holmes] reasoning to create elaborate scenarios out of incomplete data is that its most deplorable biases often arise in a natural way out of the very same processes that produce the workmanlike, all-purpose, commonsense intelligence that is the Holy Grail of computer scientists who try to model human rationality. A completely open mind would be unintelligent. It could be argued that stereotypes are not ignorance structures at all, but knowledge structures. From this point of view, stereotypes cannot be understood chiefly in terms of attitudes and motives, or emotions like fear and jealousy. They are devices for predicting other people’s behavior. One result of the revival of connectionist models in the new class of artificial intelligence machines is to downgrade the importance of logic and upgrade the role of knowledge, and of memory, which is the vehicle of knowledge” (Campbell, The Improbable Machine, New York, 1989: 35, 151, 158).

Critique and cognitive science hold different attitudes toward the inherence of stereotypes in knowledge, of course. Critique is right to condemn the acceptance of, or reconciliation with, the given assumptions implicit in cognitive science, but its own response to the problem, relying on the Enlightenment model of an absolute separation between episteme and doxa, knowledge and opinion, is too limited. This split is replicated in the institutionalization of critique in academic print publication, resulting in a specialized commentary separated from practice. Postmodern Culture could play a role in exploring alternatives to the current state of the apparatus. Grammatology provides one possible theoretical frame for this research, being free of the absolute commitment to the book apparatus (the ideology of the humanist subject and its writing practices, as well as print technology) that constrains research conducted within the frame of critique. The challenge of grammatology, against all technological determinism, is to accept responsibility for inventing the practices for institutionalizing electronic technologies. We may accept the values of critique (critical analysis motivated by the grand metanarrative of emancipation) without reifying one particular model of “critical thinking.” But what are the alternatives? The experiment I...
