
  • Resolving the Rigor versus Respect Dilemma in Community-Based Research: Commentary on Bess and Colleagues
  • Bruce D. Rapkin, PhD

Keywords: Health behavior, Community-based participatory research, Evidence-based intervention

The article by Bess et al.1 in this issue of Progress in Community Health Partnerships demonstrates a high degree of rigor in the adaptation of an evidence-based intervention. As these authors express so clearly, transformation of PREMIER2,3 into Heart Matters was a careful and thoughtful effort to resolve a persistent dilemma: the rules of research that demand fidelity to an evidence-based intervention run counter to the values of community-based research that place the needs and preferences of stakeholders at the forefront.4,5 What I want to emphasize is that the methodology described here offers a resolution to this dilemma that maximizes both rigor and respect.6–8 Specifically, the approach these authors describe to modify interventions supports a broader understanding of the notion of evidence-based behavioral interventions.

What is an evidence-based intervention? The strongest tradition in the science of health behavior, adopted from experimental traditions in medicine and psychology, largely treats the manual or protocol as the intervention.7 Just as if it were a pharmaceutical product, the premise is that administration of an intervention must be identical to the originally tested and manualized version to achieve the best possible results. Lack of fidelity is generally presumed to diminish an intervention’s effectiveness. A softer version of this position recognizes that interventions may need to be adapted for pragmatic reasons. Pragmatic trials open the door for adapting protocols to new settings and populations, but fidelity to core elements must be preserved.9 However, there is a tacit sense that pragmatic trials are a necessary compromise that diminishes the quality of data, loosening high internal validity to make necessary accommodations to the real world.

To me, the distinction between pragmatic trials and efficacy or explanatory trials is a bit misleading and off-putting. The original studies conducted to test the efficacy of evidence-based interventions all happen somewhere, and are implemented within the confines and resources of those initial settings. Yet person, place, time, and history are all assumed to be noise that washes away with randomization. The ability to demonstrate a significant effect is taken to mean that findings can be generalized to a population. However, that platonic population itself is amorphous and ill-defined. Ignoring how context always shapes interventions has resulted in minimal regard for external validity.

As Bess et al.’s article1 amply demonstrates, the very act of engaging in a participatory process to determine how the intervention needs to be changed is a scientific exercise in its own right. They show how community and academic investigators worked in a systematic way to identify how content and process directed by the protocol fit with the community and context. The ensuing discussions allowed all stakeholders to elucidate and share their own ways of thinking about the problem of cardiovascular disease. It also shed light on the [End Page 397] best ways to interact and engage with the community and identified potential barriers. Data of this sort are critical to implementation, and ought to be part of the evidence base required to conduct and evaluate behavioral interventions.

As the authors explain, the assumption that an intervention must be delivered with strict fidelity, exactly as originally received, is problematic on a number of grounds. Without evidence for external validity, it is not reasonable to assume that everyone everywhere will respond to a given intervention in the same way. Beyond this issue, it is clearly worthwhile (dare I say rigorous?) to think beyond fidelity and even beyond pragmatic accommodations. Local settings may present avenues to effect or sustain change that could not have been anticipated by the original developers of an intervention. Conversely, successful implementation may require preparatory steps, capacity building, or work-arounds that are not included in the core elements of an intervention.

The authors point out that translation of an evidence-based intervention into new settings does not have to depend on fidelity to the original protocol to achieve effectiveness. This insight is very important, because it suggests that an intervention effect is...


Additional Information

pp. 397-400
