The Contemplative Classroom, or Learning by Heart in the Age of Google
In his provocative essay “Slow Knowledge,” David Orr outlines the countervailing assumptions of what he calls “the culture of fast knowledge.” Among these are the widely shared, though rarely examined, beliefs that “only that which can be measured is true knowledge; the more knowledge we have, the better; there are no significant distinctions between information and knowledge; and wisdom is an undefinable, hence unimportant category.”1 If all this were true, it would follow that computers are fast overtaking humans as the next intelligent species. Or, to put it differently, the two species have been colluding for some time to produce smarter machines and dumber people, as we humans abdicate more and more of our mental tasks. Moreover, when it comes time to weigh values—to ask not how quickly or efficiently some task can be done, but whether it ought to be done at all—we are strangely disinclined to challenge digital fatalism, which has become the default logic of late capitalism. Whenever a new digital option appears, we assume that if it can be done and someone somewhere is doing it, then it should be done and we ought to do it too. So even the local hardware store has to be on Facebook so customers can “like” it, and the AAR needs a Twitter account to send weekly tweets. We seldom pause to ask questions about means and ends, unintended consequences, or the sheer mindless clutter of our lives—let alone the implications of this or that new app for our descendants down to the seventh generation, or the enlightenment of all sentient beings. Surface obliterates depth; instant stimulation trumps mature reflection; short-term profit overrules the long-range good.
Computers themselves are of course morally neutral. Nonetheless, it is useful to recall why contemplation is the antithesis of the fast knowledge they promote. Contemplative practice is grounded in such values as presence, mindfulness, inwardness, and the integration of mind and body. The last point is worth stressing, for all knowledge, even the most spiritual, is embodied; we are still biological beings, not cyborgs. Though contemplatives of the past have often been fierce ascetics, even the most emaciated hermit could scarcely despise the body to the degree that we find in some contemporary Internet addicts. Consider the following, reported in July 2012 in Newsweek:
When the new DSM [Diagnostic and Statistical Manual of Mental Disorders] is released next year, Internet Addiction Disorder will be included for the first time, albeit in an appendix tagged for “further study.” China, Taiwan, and Korea recently accepted the diagnosis, and began treating problematic Web use as a grave national health crisis. In those countries, . . . the story is sensational front-page news. One young couple neglected its infant to death while nourishing a virtual baby online. A young man fatally bludgeoned his mother for suggesting he log off (and then used her credit card to rack up more hours). At least 10 ultra-Web users, serviced by one-click noodle delivery, have died of blood clots from sitting too long.2
Even short of such extremes, numerous studies have linked excessive Internet use in the United States with increased rates of depression, anxiety, suicidal thoughts, and obsessive-compulsive behavior. Teens admit to being exhausted by the constant need to update their Facebook status and answer countless texts, yet they are afraid to log off for fear that some excitement may pass them by. There is even a new acronym, FOMO, for the fear of missing out. If a central aim of contemplative practice is to dismantle and disengage from the “false self,” there is no better tool for constructing false selves than social media. In her book Alone Together, the social psychologist Sherry Turkle quotes a young man who told her, “What I learned in high school was profiles, profiles, profiles, how to make a me.”3 Another student said he maintained four avatars onscreen at all times, along with his email, video games, and—yes—coursework. His “real life,” he said, was “just one more window, . . . not usually my best one.”4