  • Touch in the Abstract
  • Aden Evens

One expresses oneself at the computer almost exclusively through the mouse and keyboard. Vision is nearly indispensable, and hearing plays a supporting role, but these senses are unusually constrained at the computer, as active input falls to the fingertips. At the computer, you express yourself, communicate your desires, by executing a gesture chosen from among a very few possibilities: you can press a key on the keyboard, move the mouse, or click the mouse. That’s it. Specialized speech-based interface augmentations are available; there are eye-motion detectors and other alternative mouse-control techniques; touchscreens are proliferating in certain categories of device. But the great bulk of us continue to use the mouse and keyboard as our primary and even our only means of communicating desire to the computer. It is remarkable that so much desire gets expressed, that such a breadth of different ideas passes through this restricted interface, fingertips against plastic. The interface evidently assigns the sense of touch a particular prominence in the expression of desire. What kind of desire gets expressed via touch, and what kind of touch touches the interface?

The interface appears particularly narrow when it is recognized as a sequence of elementary commands: the expression of desire breaks down into individual keypresses and mouse clicks. Atomized, self-identical, and absolute, each keypress or mouse click is an abstraction; the k key generates not a particular, concrete k but only the selfsame, abstract command, k. (In general, interface components transduce between the material and the abstract. This is an appropriately vague definition of the interface: hardware and software that mediates between digital and actual or between abstract and material.) The job of mouse and keyboard is to render materiality as abstraction, just as the monitor and speakers take the abstraction of the digital code and present it to material sensation. A single click, the paradigmatic action at the interface, might involve hundreds of muscles, a finely tuned machine of flesh and nerve, plastic and silicon. But in the end, the mouse button is either pressed or not at any given instant of mouse clock time; nothing else matters at the computer. The materiality of the press becomes the abstract binarity of 0 or 1.
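The reduction described above can be made concrete in code. The following is a minimal sketch, not any real driver's API: the `report_key` function and its stand-in scancode table are invented for illustration. Whatever the finger does materially, only an abstract key code and a binary up/down state survive the interface.

```python
# Sketch of the keyboard's abstraction: a material press is reduced to
# (scancode, state). Force, angle, sweat, and hesitation are all
# discarded before the event ever reaches the CPU.

def report_key(key: str, pressed: bool) -> tuple[int, int]:
    """Reduce a keypress to an abstract command.

    The scancode here is a stand-in (we just use the character's code
    point); real keyboards use a hardware scancode table. The state is
    strictly binary: pressed (1) or not (0), nothing in between.
    """
    scancode = ord(key.lower())
    state = 1 if pressed else 0
    return (scancode, state)

# Every concrete, particular press of "k" yields the selfsame,
# abstract command k:
assert report_key("k", True) == report_key("K", True) == (107, 1)
```

However elaborate the machine of flesh and nerve behind it, each press arrives as the same pair of integers; that is the whole of what the interface admits.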

The case of mouse movement is only slightly more complex: The now-standard optical mouse photographs images of the surface passing underneath it. A differential analysis of successive images yields the mouse’s vector of motion. This differential mathematics is built into the mouse itself, housed on a chip designed specifically to perform these calculations. Motion (or stillness) of the mouse generates a pair of numbers, a horizontal velocity and a vertical velocity. Fifteen hundred times per second the mouse reports these numbers to the computer, passing them along through the USB bus (or over Bluetooth or whichever hardware standard is in use). The mouse cares not one bit about any other elements of touch. How hard you push, whether you’re sweaty, touch type or hunt-and-peck, the interface does not so much ignore as exclude these facets of touch, rendering touch abstract even before the CPU gets hold of the command.
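The differential analysis mentioned above can be sketched in a few lines. This is pure Python for illustration, not the sensor's actual firmware, and the function name `estimate_shift` is invented: it finds the pixel shift that best aligns two successive snapshots of the surface, and that shift, multiplied by the frame rate, is the reported motion vector.

```python
# Sketch of an optical mouse's differential image analysis: brute-force
# search for the (dx, dy) shift that minimizes the squared difference
# between two successive surface snapshots (square grids of brightness
# values). Real sensor chips do this in dedicated hardware.

def estimate_shift(prev, curr, max_shift=2):
    """Return the (dx, dy) shift of curr relative to prev that gives
    the smallest mean squared difference over the overlapping region."""
    n = len(prev)  # frames are assumed square, n x n
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, count = 0, 0
            for y in range(n):
                for x in range(n):
                    ys, xs = y + dy, x + dx
                    if 0 <= ys < n and 0 <= xs < n:
                        err += (prev[y][x] - curr[ys][xs]) ** 2
                        count += 1
            if err / count < best_err:
                best_err, best = err / count, (dx, dy)
    return best
```

Fed two frames in which the surface texture has drifted one pixel to the right, the search returns `(1, 0)`; stillness returns `(0, 0)`. Everything else about the hand's movement, its pressure, its tremor, never enters the calculation.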

Which is to say, even the materiality of the mouse is coded by the binary; it’s as though the digital extends its effect by reaching out of the computer to determine the characteristics of its surroundings. The modern mouse takes photographs of the surface passing underneath it: an eighteen-by-eighteen pixelated grid, a snapshot taken so many times per second. Older opto-mechanical mice feature a mouse ball connected by gears to wheels inside the mouse, flat plates that turn along with the spinning of the mouse ball, one for forward-backward and one for left-right motion. The wheels have evenly spaced holes around their perimeters, admitting and interrupting a beam of light shining onto a sensor. The sensor counts these interruptions per second to calculate the movement vector. (It must also include some mechanism for determining the direction of motion of each wheel, since the number of interruptions of the light beam indicates how rapidly but not in which...
