
Brookings Papers on Education Policy 2001 (2001) 218-223




Comment by Robert H. Meyer

Searching for Indirect Evidence for the Effects of Statewide Reforms

David Grissmer and Ann Flanagan sought to test indirectly for the effects of statewide reform efforts on student achievement by analyzing the state National Assessment of Educational Progress (NAEP) tests in mathematics at the fourth- and eighth-grade levels from 1990 to 1996. The authors assume that statewide reform efforts begun in the mid-1980s would have stimulated growth in student achievement during the 1990-96 period if the reforms were effective. They estimate rates of growth in mathematics achievement over this period for individual states and for the nation as a whole, controlling statistically for differences across states and over time in the demographic mix of students. After this adjustment, Grissmer and Flanagan find that national average test scores in the fourth and eighth grades increased during 1990-96.
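The kind of demographic adjustment described above can be sketched as a regression of student scores on state, year, and demographic indicators. The sketch below uses simulated data; all variable names, effect sizes, and sample sizes are invented for illustration and are not the authors' actual specification or NAEP values.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Simulated student-level records for two states and two test years.
state = rng.integers(0, 2, n)                      # state indicator
year = rng.integers(0, 2, n)                       # 0 = 1990, 1 = 1996
minority = rng.binomial(1, 0.3 + 0.1 * year, n)    # demographic mix shifts over time
score = 250 + 5 * state + 4 * year - 8 * minority + rng.normal(0, 15, n)

# Raw change in mean scores mixes real growth with the shifting demographic mix...
raw_change = score[year == 1].mean() - score[year == 0].mean()

# ...while a regression that controls for the demographic indicator
# isolates the growth component (the coefficient on year).
X = np.column_stack([np.ones(n), state, year, minority])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(raw_change, beta[2])   # compare raw and demographically adjusted estimates
```

Because the share of the demographic group grows between the two years, the raw change in means and the adjusted growth estimate differ; the adjusted coefficient recovers the simulated growth effect.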

In this comment I discuss how this evidence should be interpreted. In particular, I examine whether this evidence indicates that the productivity of American education has increased over the last decade.56 I conclude the following. First, the information contained in the state and national NAEP data for 1990-96 is too limited to support definitive conclusions about the productivity of American education. The major weakness of the data is that they do not contain information on the achievement growth of multiple cohorts of students. As a result, analyzing the data using conventional value-added models of student achievement and school productivity is not possible. I draw on NAEP data for the period 1973-86 to show how more extensive data can be used to conduct such analyses. Second, I show that despite the limitations of the NAEP data for 1990-96, the data can be analyzed or interpreted within the value-added framework. My analysis supports the conclusion that national productivity growth in grades five through eight most likely increased during the late 1980s and declined during the mid-1990s.

Before turning to the weaknesses of the state and national NAEP data for 1990-96, I should note that the NAEP data do have some major strengths. First, the NAEP is regarded as a high-quality assessment that covers a broad domain of mathematics skills. Second, the NAEP is not a high-stakes examination, and thus it cannot be argued that improvements in NAEP test scores are the product of narrow teaching to the test.57

The basic problem with the state NAEP data (for the period 1990-96) is that they do not permit tracking of achievement growth of different cohorts of students. Information on achievement growth is currently available for only a single cohort of students, those who were fourth graders in 1992 and eighth graders in 1996. As a result, using conventional value-added models of achievement growth to estimate the effectiveness of state educational policies with these data is not possible.58 Instead, the authors estimate the rate of growth in fourth- and eighth-grade achievement (controlling for differences across states and over time in the demographic mix of students). In previous research I have demonstrated that changes in the level of student achievement often provide a misleading picture of changes in the efficacy of schools and school policies.59
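The distinction drawn above between modeling achievement levels and modeling achievement growth can be illustrated with a minimal value-added regression. The sketch below uses simulated data; the variable names, coefficients, and the "policy" indicator are hypothetical and are not drawn from the NAEP or from Meyer's own models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated students: a prior score, a demographic indicator, and
# exposure to a hypothetical reform policy (all values illustrative).
prior = rng.normal(250, 30, n)             # e.g., grade-4 score
demog = rng.binomial(1, 0.4, n)            # demographic indicator
policy = rng.binomial(1, 0.5, n)           # reform exposure (invented)
current = 40 + 0.9 * prior - 5 * demog + 3 * policy + rng.normal(0, 10, n)

# Value-added model: condition the current score on the prior score,
# so the policy coefficient reflects growth during the period rather
# than differences in entering achievement.
X = np.column_stack([np.ones(n), prior, demog, policy])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
print(beta)   # coefficients roughly recover the simulated values
```

The key ingredient is the prior-score column: without repeated measurements on the same (or successive) cohorts, that column cannot be constructed, which is exactly the limitation of the 1990-96 state NAEP data described above.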

What are some of the problems that arise from not having data on achievement growth for different cohorts? One major problem is that researchers cannot control for what students know when they enter school in kindergarten or first grade. As a result, changes in the average level of student achievement could reflect differences that existed before students began traditional schooling. For example, increases in eighth-grade achievement could reflect improvements in the quality of preschool education that occurred a decade earlier. In the absence of data on student achievement measured soon after students begin their schooling, it is difficult, if not impossible, to disentangle the contributions of schooling from those of resources received before conventional schooling begins.
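The preschool confound described above can be made concrete with a small simulation: two cohorts receive an identical schooling contribution, but the later cohort enters school better prepared. All numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Hypothetical cohorts: the later cohort enters school 10 points higher
# (e.g., better preschool), but schools add the same amount to both.
entry_early = rng.normal(200, 20, n)
entry_late = rng.normal(210, 20, n)
school_gain = 60                           # identical schooling effect

grade8_early = entry_early + school_gain + rng.normal(0, 5, n)
grade8_late = entry_late + school_gain + rng.normal(0, 5, n)

# A levels comparison suggests schools improved between cohorts...
level_change = grade8_late.mean() - grade8_early.mean()

# ...but a growth comparison shows the schooling contribution is unchanged.
growth_change = (grade8_late - entry_late).mean() - (grade8_early - entry_early).mean()
print(level_change, growth_change)
```

The levels comparison picks up the change in entering achievement, while the growth comparison correctly attributes nothing to the schools; without entry-level scores, only the misleading first comparison is available.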

A second...
