Response to David L. Richards
We appreciate the opportunity to respond to David Richards's critique of some of the arguments we made in "Information Effects and Human Rights Data."1 Before doing so, we want to thank David Richards, David Cingranelli, and their colleagues for their longstanding work in developing and maintaining the CIRI dataset.2 Our article was offered in the spirit of constructive dialogue, and we hope to continue that dialogue in these pages.
In responding to the specifics of Richards's discussion of our piece, we confine ourselves to addressing what we see as his three main points. First, he takes on the question of how word counts are related to coding. Second, based on mean CIRI scores, he argues that there is little indication of a time trend in the data. Finally, he questions the use of Latin American cases as illustrations of how coding does or does not respond to changing human rights conditions.
Much of Richards's critique concentrates on rebutting what we see as a relatively minor component of our overall analysis: the section that examines word counts of the annual US State Department reports, on which CIRI's coding is primarily based. In the original article, we made a much more complex argument than Richards suggests. We discussed changing standards of accountability, information effects, political bias, and ceiling effects. We argued that Amnesty International and the US State Department have been producing their reports in an increasingly rich information environment over time. Our argument was not primarily about the length of their reports, but about the changing context within which they produced their reports, both in terms of the amount of information available and the interpretations of that information. Because we could not easily model this information environment, we used word counts as one proxy for information. We reported a correlation between word counts and coded intensity of human rights abuses for all but the US State Department reports, which, as we noted, have expanded over time with Congressional and executive mandates. We cautioned, however, that this finding was not conclusive and offered the results only as a "first cut." To suggest that our main point was "more words = more information = worse scores" entirely flattens the argument.
Richards rightly points out something our analysis did not account for: the State Department reports are separated into sections according to types of violations, with new sections added as its mandate expanded. Thus, the variation in information about physical integrity rights may not be as wide-ranging as overall word counts of State Department entries suggest. Amnesty International's mandate has also broadened, but its report length has not varied nearly as much. Although we are not in a position to test such a hypothesis now, given the results of our exploratory analysis in the 2013 article, we would postulate that eliminating this wide, unsystematic variation by substituting word counts of the specific torture section of the State Department reports in our analysis might strengthen the correlation between report length and CIRI coding.
Richards, however, places much more emphasis on word count than our article did. We tested the relationship statistically while he did not, but we agree with him that word counts are not dispositive. Neither is the flatness of annual CIRI averages depicted in Richards's Figures 1–4 particularly telling. In general, a global average, with countries getting worse and better over time and with countries entering or exiting the data over time, is not necessarily a reliable representation of global human rights trends. For that reason, we elected not to present any general picture of global averages in our original article.3 Little can be concluded on the basis of aggregate averages, either global or regional. Our argument, as presented in the case studies, is that the likely improvement in physical integrity rights that can be documented in Latin America is hidden by information effects. This is an argument that Christopher Fariss has also advanced and supported with new analysis using latent variable modeling.4
The disaggregated global averages of CIRI...