
7 / Conclusion

Summary

In this book, I have argued that statistical objectification works to maintain the hegemony of the high-stakes testing system in Texas. In one sense, statistics objectify Texas students, teachers, and the public, inscribing them as objects of governance. I came to this conclusion by using Abu-Lughod's (1990) suggestion of viewing resistance as diagnostic of power, seeing the forms of resistance against testing as resistance against being transformed into a statistic, as one student put it, into "a name and a score." I found that Texas students, their parents, their teachers, and others were engaging in resistance against what Foucault (1983) called the "submission of subjectivity." Statistics on testing also generate what Woodward (1999) called "statistical panic." This structure of feeling, constructed by the Texas Education Agency and the media, enforced widespread test anxiety across Texas, which both instilled fear in children, their parents, teachers, and administrators and served as a political rallying point for the movement for multiple criteria. I argue that the panic generated from statistics on testing allowed teachers and administrators to target students of color as at risk of failure and to render them invisible through the policy of pushing out. Statistical discourse on testing not only objectifies students as things but also provides the conditions for the commodification of knowledge, as testing corporations, such as NCS Pearson, profit in the millions from testing. By incorporating students, teachers, administrators, state agency workers, and even school communities within a system of competition, the state of Texas has not only shielded the testing system from criticism but also created a rewards-sanctions system that supports only the profitability of testing.
I claim that statistics are key to the profitability of testing because within the postmodern informational economy, statistics provide a means for commodifying (objectifying, in a Marxist sense) social facts. The profitability of testing and data-processing companies is accompanied by the neoliberal imperative to privatize public functions, that is, to redistribute funds directed toward public services to private companies. While statistics historically became integral to government with the development of the welfare state, neoliberals have recuperated statistical discourses that oppose social welfare, specifically Malthusianism and eugenic meritocracy, while emphasizing the need for economic efficiency through the statistical discourse of quality control. As Castel (1991) suggested, preventive policies tend to dissolve subjectivity by reducing individuals to statistical factors and transforming intervening specialists, such as teachers, into mere executants, while overemphasizing the role of administrators and creating opportunities, as Apple suggested, for the managerial middle class. Through this recuperation, neoliberal discourses delegitimize public schools as social welfare by reducing education to statistical factors, claiming to prove both the inefficiency of equitable district funding and the inevitability of "minority failure" (see McDermott 1997). An additional component of the delegitimizing of public education is the gendered devaluing of teaching, an occupation held predominantly by women, as too subjective. Along with objectifying subjectivities and labor, statistical discourses objectify truth through a hegemonic struggle over the production of truth. Statistical materialism has historically been a key tool in establishing hegemony, by representing the collective, or "national-popular," will and functioning as a popular religion or way of viewing the world.
Statistical discourses have also been central in educating consent, or, as Woolf (1989) suggested, "affirming consensus," and in constituting a tool of moral self-government and self-identification. In terms of testing, statistics, which Asad (1994) suggested is the modern language and politics of progress, are central to the construction of the concept of minority failure. The statistical tools of representativeness via polls and sampling were central in constructing the idea of a collective will that the TPERF claimed could be summarized as "Texans are saying, 'Don't mess with testing.'" For the TEA, the statistical tool of standard error of measurement was central in stabilizing the testing system in Texas, particularly in guarding against challenges to the legitimacy and validity of both the new TAKS test and the accountability system, which was empowered, the same year the TAKS was introduced, to keep third graders from being promoted to the next grade. The validity of the testing system also depended on the statistical tool of correlation, which functions as ideological glue in that it constructs relationships between quantifiable entities and commonsensically suggests a relationship of cause. The hegemony of statistical materialism can also be attributed to statistical subjectivity, or the use of statistics as...
