Abstract

Background: Creating strong partnerships with community organizations is essential to test implementation of evidence-based interventions. However, partners are often chosen based on convenience rather than capacity or diversity. Streamlined processes are needed to identify qualified, diverse, and invested partners to conduct community-based research.

Objectives: There is a gap in the literature on effective and efficient processes for recruiting partners. This paper aims to fill that gap by describing a novel approach for identifying a diverse group of community organizations to participate in research.

Methods: We used a Request for Partners (RFP) approach to recruit partners to participate in a hybrid implementation-effectiveness study of the Veggie Van mobile market model. The process included formative work to inform RFP development, creation of an external advisory committee, an intent-to-apply round, a full application round, and an in-person training and selection process. Data were collected to characterize applicant size, location, and experience; pre-post surveys were conducted to understand the training's utility.

Results: We received 59 intent-to-apply submissions and invited 28 organizations to apply: 17 submitted applications and 12 organizations were chosen as finalists. The process took approximately 8 months to recruit 9 organizations and 32 community sites across 5 states, and it increased understanding of the intervention and partner responsibilities.

Conclusions: An RFP process is familiar to many community organizations that compete for grant funding but may not have prior research experience. This process streamlined recruitment timelines, increased diversity, and cultivated community among organizations. It may also improve research transparency, study completion, and intervention fidelity.

Keywords

Implementation study, partner recruitment, community-based research, mobile markets, request for partners

Dissemination is a key tenet of community-based participatory research (CBPR), but there is limited literature addressing how to disseminate evidence-based community interventions.1 While the field of dissemination and implementation science focuses on the best ways to translate evidence-based interventions to new settings, the majority of early research in this field has been focused on clinical, rather than community, settings.2 Combining principles of dissemination and implementation and CBPR provides an opportunity for testing the implementation of evidence-based community interventions. To progress to implementation, efficacious public health interventions must be tested in their intended context with delivery by appropriate community organizations.3–5 Strong relationships with community organizations that can implement the intervention and participate in research are crucial to this process. While there is a breadth of research on the facilitators, barriers, and outcomes of community–academic research partnerships, there is limited literature detailing the characteristics and processes of forming these partnerships in the context of a community-based implementation study6,7 or food retail partnerships, in this case mobile produce markets. This is likely because most partnerships are formed based on long-standing relationships and networking rather than systematic processes.8–10

Many factors need to be considered when recruiting community organizations to engage in implementation research, including organizational (e.g., budget, staffing) and individual factors (e.g., knowledge, opinions, expectations toward research).11 Despite recommendations for developing rationale, criteria, and procedures for adding new community partners,12 detailed documentation of how these decisions are made is lacking in the literature. Having a systematic process for recruitment that allows for greater partner diversity and improves transparency in community–academic partnerships may ultimately strengthen these partnerships. Furthermore, having a diverse group of implementation partners allows for better understanding of external validity of interventions and documentation of beneficial adaptation strategies.13

To fill a gap in the literature on systematic processes for recruiting community organizations for implementation research, the goal of this paper is to describe a novel Request for Partners (RFP) process, developed with the input of community stakeholders and an outside advisory committee, that was designed to streamline partner recruitment and produce a well-qualified, diverse, and invested group of partners. The RFP process is described within the context of an implementation study for a previously developed evidence-based intervention that was designed using principles of CBPR. In addition, we discuss the challenges and benefits of the chosen approach and recommendations for future studies looking to use a similar process.

METHODS

Research Context

The RFP process described here was developed to recruit and select partners for a hybrid effectiveness–implementation study14 of the Veggie Van (VV), a mobile produce market model designed to improve fruit and vegetable intake and other related health outcomes in lower-income and under-served communities. The VV model addresses multiple dimensions of access to fresh produce by offering a variety of fresh, high-quality fruits and vegetables at a reduced cost. Produce is sold at convenient locations (i.e., community sites) that already serve the target population of low-income families through complementary services (e.g., health clinics, community centers).15 The model was developed through a partnership between researchers and one community-based organization with the input of community stakeholders and lower-income community members.15,16 The model was previously tested in a cluster-randomized controlled trial in partnership with 12 communities, which found that it significantly increased fruit and vegetable consumption.17 Additional details on the history of the VV can be found in Appendix 1.

Given the effectiveness of the original VV, the research team and the original community partner that ran the VV developed a dissemination toolkit designed to guide other organizations through the step-by-step process of engaging community members and community organizations to design and launch a market. As the original VV intervention was developed with one community partner and implemented in one region, the next phase was to further test the model's effectiveness by examining whether it could be implemented by other partners in other communities. The methods for the next phase of research were developed based on feedback from partners involved in the initial efficacy study,17,18 as well as community organizations that were already operating mobile markets.19 However, to test the VV model and toolkit nationally, relationships with new partners needed to be formed.

Details of the current study are forthcoming, but the initial plan was to identify eight partners to operate mobile markets following the VV model at four community sites each (32 communities total). Prospective partners were organizations with a mission to improve food access and/or serve populations with limited access to healthy food that were willing to start or expand a mobile market program following the VV model. In recruiting partners, the research team planned to identify organizations that were not currently running a mobile market or had limited experience doing so. We were also willing to work with existing markets interested in expanding and adopting the VV model. Each of the selected partners would receive technical assistance for operating a mobile market following the VV model and access to the VV toolkit. Chosen partners would receive funding (≤ $50,000 each) to offset the cost of adopting the VV model and participating in study-related data collection. In turn, each of the eight partners would be asked to identify four community sites to host the mobile market. Two of these sites would be randomized to be market sites, launching a market shortly after baseline data collection, while the other two would engage in a yearlong community engagement and planning process to host a market after a 1-year follow-up data collection period. As such, the team decided that an RFP process clearly describing the VV model and partner expectations would help identify the most appropriate candidates.

RFP Development Process

We developed the RFP process to mimic a process familiar to many community organizations (i.e., applying for a grant). It was informed by our team's previous experience with using an application and selection process to identify research partners for food systems work.20,21 The first step was to determine the selection criteria for our implementation partners; this was informed by interviews with mobile market key informants.19 Next, we formed an advisory committee, external to the research team, to guide the RFP development and selection process. The study team used pre-existing connections to identify individuals with expertise that matched criteria identified by key informants. All but one potential committee member accepted; the individual who declined recommended a colleague he felt would be well suited for the role. The final advisory committee included five individuals from different areas of the United States, and each received an honorarium of $1,000 for helping develop the RFP guidelines and participating in the selection process. The committee included the director of a non-profit that runs a mobile market, a farmer who serves lower-income communities, an unaffiliated university faculty member with expertise in community engagement and equity, and two representatives from businesses focused on local food aggregation, sales, and logistics.

Establishing Inclusion and Selection Criteria for Implementation Partners

Our research team decided that potential partners should be limited to organizations that primarily serve an urban population, as the VV model had been previously tested in an urban area. We also limited the study region to 20 states in the north and southeast to allow our research team (based in New York and North Carolina) to support the selected partners more easily. To inform the selection criteria, we conducted key informant interviews with 21 established mobile markets.19 Using the findings from these interviews, and the research team's collective experience with operating mobile markets, community engagement, and food systems work, we drafted the initial RFP questions and selection criteria (Table 1). The RFP was then reviewed and refined by the advisory committee.

RFP Release

The full RFP guidelines can be found in Appendix 2. The RFP guidelines indicated that selected partners could receive technical assistance and up to $50,000 to offset the costs of running a mobile market following the VV model and assisting with data collection. The proposed and actual timelines for the RFP process are shown in Table 2. The RFP was advertised through a variety of food-related listservs, grant databases, and professional contacts of the research team and advisory committee (Table 3). Invitations were also sent to the 17 organizations that were not eligible for the key informant interviews because they had not been in operation for at least two years. After release, potential applicants could submit clarifying questions, and we held two informational webinars where the research team answered questions. As we wanted organizations with varying levels of experience with mobile markets and/or grant writing to feel comfortable applying, we emphasized our availability to answer questions and to support them through the process.

RFP Selection Process

To be considered for the RFP, community organizations were first asked to complete an online "Intent to Apply" form (round 1). Next, we notified applicants whether or not they were invited to complete a full application (round 2). Finalists were selected from the full applications and invited to attend an in-person training and selection process (round 3). After the selection process, we asked finalists to submit budgets for review by the advisory committee (round 4). Last, we notified all finalists whether or not they were chosen as partners.

Table 1. Request for Proposal Selection Criteria, Weights, and Associated Questions

Table 2. Proposed and Actual RFP Process Timeline

Intent to Apply (Round 1)

Interested applicants were asked to complete a brief online submission form and provide the following information: type of organization, location, area served, mission, programs offered, target market, mobile market experience, organizational reach, and operating budget. For this initial round, our goal was both to remove organizations that clearly did not qualify before submission of a full application and to ensure that approximately 20 organizations would proceed to round 2.

Table 3. Proposed and Actual RFP Process Timeline

First, the research team reviewed the applications and excluded applicants that did not meet the eligibility criteria (urban location, eligible state). Next, to get closer to the goal of 20 round 2 applicants, we removed any organizations whose missions were not aligned with the VV model or that had been operating a mobile market for more than two years, as the primary intent was to help start new mobile market programs. At this initial stage, all determinations on whether organizations met inclusion criteria were made by the research team. The one exception was that, during round 1, the research team did not make these mission- or operating-experience exclusions for organizations with which they had a previous relationship; instead, the advisory committee was asked to review these organizations. This decision was made to avoid the perception that the study team was showing favoritism to a previous partner over other applicants.

Full Application (Round 2)

Any applicants that were not excluded in the first round were invited to submit a full application. The narrative portion of the application was limited to five single-spaced pages. Applicants were asked to answer eight questions related to the selection criteria (Table 1). All applications received were sent to the advisory committee for review, and each application had two reviewers. Each of the five advisory committee members reviewed approximately six to seven applications and shared their assessments with the rest of the committee. Reviewers were asked to declare any conflicts of interest and were not allowed to review any application with which they had a conflict. They then completed a scoring sheet for each application and rated each organization on a scale of 1 (poor) to 5 (excellent) for each of the selection criteria (Table 1). We averaged the scores from both reviewers and weighted them based on the selection criteria in Table 1 so that each organization would have a total possible score of 100.
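To make the scoring arithmetic concrete, the sketch below shows one way the averaging and weighting could be computed. It is an illustration only, not the study's actual procedure or code: the criterion names and weights are hypothetical placeholders (the real criteria and weights appear in Table 1), and it simply assumes that each averaged 1-5 rating is scaled to its criterion weight, with weights summing to a maximum of 100 points.

```python
# Illustrative sketch (not the authors' code): combining two reviewers'
# 1-5 ratings into a weighted score with a maximum of 100 points.
# Criterion names and weights below are hypothetical placeholders.
WEIGHTS = {
    "mission_alignment": 30,
    "community_engagement": 25,
    "organizational_capacity": 25,
    "site_partnerships": 20,
}  # sums to 100

def weighted_score(reviewer1: dict, reviewer2: dict) -> float:
    """Average the two reviewers' 1-5 ratings per criterion, then scale
    each criterion to its weight so the total possible score is 100."""
    total = 0.0
    for criterion, weight in WEIGHTS.items():
        avg = (reviewer1[criterion] + reviewer2[criterion]) / 2
        total += (avg / 5) * weight  # a rating of 5 earns the full weight
    return total

# Example: an applicant rated mostly 4s and 5s lands in the high 80s
r1 = {"mission_alignment": 5, "community_engagement": 4,
      "organizational_capacity": 4, "site_partnerships": 5}
r2 = {"mission_alignment": 4, "community_engagement": 4,
      "organizational_capacity": 5, "site_partnerships": 4}
print(round(weighted_score(r1, r2), 1))  # 87.5
```

Under this reading, the applicants' weighted totals can then be rank-ordered as described below.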

The combined and weighted scores for each applicant were rank-ordered. To account for differences in scoring style, each reviewer was also asked to indicate the top two applicants they wanted to proceed to the final round; the goal was to identify 12 finalists. The advisory committee met as a group with a member of the research team who facilitated the discussion. The two most highly rated applicants advanced to the next round without objection. Committee member discussion focused on applicants with more than a 10-point discrepancy in scores between reviewer 1 and reviewer 2. Throughout the discussion, reviewers could adjust (increase/decrease) or keep the score they awarded initially. Once scores were recalculated, a clear divide emerged between the top 10 applicants and the rest. However, two lower-scoring applicants were also selected to proceed because they received a "top two" designation from at least one reviewer.

In-person Training and Selection Process (Round 3)

Finalists were invited to attend the in-person selection process and training to learn more about the VV study, the model, and the requirements of being a partner. The goal of this process was both for the advisory committee to evaluate the finalists and for the finalists to determine if the partnership was a good fit for their organizations. Before attending the in-person training session, finalists submitted a budget proposal (maximum $50,000) detailing how they intended to spend the funds if awarded. They were also asked to indicate within the budget any in-kind sources of support they had for the project.

A 1-day training and selection meeting was offered in conjunction with an optional 2-day Mobile Market Summit that was hosted by our team but open to all mobile market practitioners. The research team paid travel expenses for the training for up to two members from each of the finalist organizations. The training included several sessions providing an overview of the VV model, study and partner requirements, and ample opportunity for questions. For the selection process, finalists were divided into two groups and each gave a 10-minute PowerPoint presentation. The advisory committee highlighted areas in finalist applications that were unclear or underdeveloped, which informed our recommendations of topics for finalists to focus on in their presentations. Finalists from smaller, less experienced organizations presented in the morning and finalists from larger, more experienced organizations presented in the afternoon. After each set of presentations, finalists met privately with the advisory committee so the committee could ask questions about the applicants' presentations, applications, and budgets. Each meeting with the advisory committee was limited to 10 minutes, but finalists also had the opportunity to sign up for one-on-one technical assistance meetings throughout the day with VV team experts and advisory committee members. The study training closed with a budget session to provide guidance on developing final budgets.

Final Selection Process (Round 4)

Finalists were asked to revisit their budgets based on the information learned at the training and feedback from the advisory committee and to submit a final budget three weeks after the in-person training. Advisory committee members submitted another round of scores for each finalist based on the in-person presentation and proposed final budget. The criteria for this round of scoring were designed to provide equity across organizations of different sizes and to reflect the value of having a diverse group of partners (not just the largest or most experienced). Specifically, we asked each advisory committee member to rate each partner on a scale of 1 (poor) to 5 (excellent) in four areas: (1) preparedness and completeness of the presentation, (2) partnerships and community sites, (3) responses in the post-presentation question and answer session, and (4) final budget. One advisory committee member could not attend the training due to a canceled flight and reviewed the PowerPoint slides for the presentations at a later date, but did not provide scores for the question and answer session. For each finalist, scores from all selection committee members were summed.

During the selection process, the study's budget was reduced so that only six partners could be supported. However, during that time, we obtained additional funding to support two potential partners in North Carolina; because the funder chose the specific applicants to support, those partners were accepted automatically, and the competitive selection process for the other six spots continued without those applicants in the rankings.

Community Site Selection and Randomization

Once partners were selected, we worked with them to engage community sites and members in starting a mobile market following the VV model. While we provided resources via the VV toolkit and technical assistance, each organization determined how they would adapt the model to their community context and community feedback. Together we developed a timeline for when each community site would begin the community engagement and participant recruitment process. Each partner chose two communities to start first, and these became a pair for randomization purposes. Two months before each pair was scheduled to start the community engagement process, the sites in the pair were randomized to be either a market site or a 12-month planning site. Once randomized, the partner was responsible for communicating with their community sites about the market and study. The research team provided suggested communications about the study and a sample memorandum of understanding that could be used. After the partner and the site finalized the memorandum of understanding terms, it was signed by the research team, partner, and community site. Site randomization, community engagement, and participant recruitment are ongoing and will be reported elsewhere.
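For readers who want a concrete picture of the pairwise assignment described above, the sketch below illustrates one simple way such a randomization could be carried out. It is not the study's actual randomization code, and the partner and site names are hypothetical; it only assumes that within each pair one site is drawn at random to launch a market while the other becomes the 12-month planning site.

```python
# Illustrative sketch of pairwise site randomization (not the study's code).
# Assumption: within each pair, one site is randomly assigned to launch a
# market and the other becomes a 12-month planning site.
import random

# Hypothetical partner and site names
site_pairs = {
    "Partner A": [("Clinic North", "Community Center East"),
                  ("Library South", "Church West")],
}

def randomize_site_pairs(site_pairs, seed=None):
    rng = random.Random(seed)  # a fixed seed makes the assignment reproducible
    assignments = []
    for partner, pairs in site_pairs.items():
        for pair in pairs:
            market = rng.choice(pair)                            # market site
            planning = pair[1] if market == pair[0] else pair[0]  # planning site
            assignments.append({"partner": partner,
                                "market_site": market,
                                "planning_site": planning})
    return assignments

for assignment in randomize_site_pairs(site_pairs, seed=2022):
    print(assignment)
```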

RESULTS

Figure 1 details the number of applicants received and advanced through each stage of the process. We received 59 intent-to-apply submissions, of which 28 applicants were invited to complete a full application. Thirteen applicants were rejected because they served predominantly rural areas, and 12 applicants because they had been operating a mobile market for 2 or more years. An additional six applicants were removed because their organizational mission and population served did not align with the VV model. Six applicants that had previous relationships with the study team were not reviewed by the research team during this round. Of the 28 invited applicants, 17 submitted full applications, and 12 were invited to be finalists and attend the in-person selection process, training, and Mobile Market Summit. The entire process took approximately 8 months (Table 2).

Figure 1. Organizations included in each round of the request for proposals process.

Final Scores and Selection

After excluding from the finalist process the two North Carolina applicants who had already been awarded separate funding, nine finalists remained. One organization was eliminated because the advisory committee members unanimously recommended against it. The remaining eight finalists were ranked based on final scores, and a divide emerged between the top five and the bottom three; therefore, we decided to accept the top five, which left one remaining slot. It was noted that the sixth- and seventh-ranked finalists were smaller than the other finalists and had less capacity, but would increase the study population's diversity. Thus, the PI recommended we accept both of these organizations as provisional partners. The remaining (eighth-ranked) finalist was eliminated. The provisional partners were offered $25,000 (instead of $50,000), had to complete recruitment and data collection with only two community sites (rather than four), and were expected to start only one market (rather than two). The other finalists (ranked first to fifth) were invited to be study partners and offered $50,000 in funding each to support their efforts.

Applicant Characteristics

Characteristics of the mobile market applicants are included in Table 4. Small organizations with an operating budget of less than $250,000 per year represented the largest share of intent-to-apply submissions (40%). Five of the nine selected partners (56%) were categorized as small organizations. The majority of the intent-to-apply submissions (78%) and final partners (78%) were non-profit organizations; the other two partners represented a university and a small business. Among those that submitted an intent to apply, 37.3% of applicants had never run a mobile market. Of the final partners, three organizations (33%) had never run a mobile market. The majority (59%) of intent-to-apply submissions and six selected partners (66%) came from our study area's northern region.

Training Evaluation

A survey was administered pre- and post-training to collect feedback from finalists on mobile market experience and operations, clarity of partner expectations, and knowledge gained at both the mobile market summit and training. Representatives from 9 of the 11 applicant organizations (n = 16 individuals) completed the 54-item pre-training survey, which was developed with the mobile market summit planning committee consisting of practitioners and academics. Pre-training survey responses indicated that the main reasons organizations chose to apply to the RFP were an interest in evaluating their program impact (80%), a desire to reach more customers (73.3%), and a desire to help the market become more financially sustainable (60%). Before the training, respondents from applicant organizations were most interested in learning how the VV model could help their mobile market become sustainable (93.3%).

Table 4. Request for Partners Applicant Characteristics

Representatives from all 11 applicant organizations (n = 12 individuals) completed the post-training survey. Following the training, all finalists reported that the VV model would be a good (18.2%) or great (81.8%) fit for their organization. There was an overall increase in understanding of the VV model among finalists: 63.6% of respondents reported being very to extremely familiar with the model (compared with 28.6% pre-training). The majority also reported a high degree of clarity around partner responsibilities after attending, with 72.7% of respondents reporting that the community engagement responsibilities were very clear and 27.3% reporting that they were somewhat clear. After training, expectations around data collection ranged from somewhat clear (60%) to very clear (40%).

DISCUSSION

There is limited research discussing how to recruit community partners for implementation research. While previous studies have used an RFP process to recruit research partners,21 we are not aware of any literature documenting the development, execution, or effectiveness of that process. In reflecting upon the RFP process, we note several benefits and lessons learned. Benefits include a shorter recruitment timeline compared with our previous study,18 improved understanding of partner responsibilities, better organizational fit, and an expanded and more diverse partner network. We also recognize areas for improvement, including better alignment of partners' timelines with study needs and the need to select partners based on capacity both to implement the intervention (e.g., VV model components) and to carry out research activities (e.g., assisting with recruitment of study participants).

When researchers engage new community organizations for research purposes, extended timelines are common; specifically, time is needed to develop trust, identify champions, and manage conflicting agendas with existing initiatives.18 For the current study, the initial recruitment process took approximately 8 months to recruit 9 organizations and 32 community sites, compared with the 38 months required to recruit 12 community sites for the previous efficacy study.18

Our previous research has indicated that mismatch between research and community partner timelines and a lack of understanding of randomization can lead to site and participant recruitment and retention challenges.18 The multistep RFP process, which included informational webinars and ongoing support, allowed the partners, study team, and advisory committee to evaluate organizational fit. Specifically, the 1-day training and selection meeting was a unique and crucial event for clarifying expectations, ensuring organizational alignment with the VV model, and understanding the applicants' capabilities.

One limitation of the RFP process was that the detailed partner requirements may have seemed overly prescriptive and obscured the nature of the partnership. The RFP document was intended to be very upfront about expectations, as our advisory committee and original partners recommended this approach. While it is possible that this may have deterred some partners from reaching out, individual meetings with prospective partners allowed the research team to convey that, in practice, each partner's decisions about how to run its mobile market would be based on what made sense for its business, with the goal of achieving the VV model when possible. The VV team could not require partners to engage in specific activities, but rather would work with each partner individually to identify the best ways to engage community members, plan for the market, and troubleshoot issues. All adaptations and deviations from the original VV model will be documented and described in future publications.

Another limitation of this method was that we sought only partners operating in urban areas. Despite this, we received many applications from rural markets and saw interest in expanding the model to these areas, so we believe that this RFP method could also be used in non-urban settings. An unintended benefit of the RFP process was a significant expansion of the research team's partner network. The RFP process expanded the diversity of potential partners in terms of organizational characteristics (size, location, type) and the populations served. In addition to new partners selected for the study, we identified many mobile markets through our initial outreach survey used to identify key informants and through the RFP dissemination process. These potential partners were added to our listservs, and many have continued to participate in networking and training events offered by our team, including the annual Mobile Market Summit. Notably, they have also become engaged in subsequent implementation research with our study team. The training and summit also catalyzed networking and information sharing among mobile markets; we are currently working with several of these partners to create a mobile market coalition to formalize this community of practitioners. Funders looking to support mobile markets have also seen the value of the RFP process and have provided additional support to fund more applicants.

Based on our experience, we would recommend a few adjustments for researchers looking to adopt a similar recruitment and selection process. First, we would recommend a flexible selection process that takes into consideration more than just scores. Although we had clear criteria and a scoring rubric for advisory committee members to review applications, we did not have a predetermined process for how we would use the scores to make final selections. After speaking with the advisory committee about their scoring rationale, we decided to use rankings in addition to the average scores. For example, one organization was not selected as a final partner despite high average scores because the advisory committee ranked it lower: it was very large and experienced and might not benefit from the funding and technical assistance to the same degree as other organizations.

Our second recommendation would be to ensure that the RFP selection timeline matches study capacity. All partners were accepted into the study at the same time. However, limitations in the study team's capacity to train partners, support their community engagement efforts, and collect data from study participants at each site meant that some partners could not launch their markets until almost a year later. For similar reasons, previous implementation studies have recommended recruiting partners on a rolling basis.10 Alternatively, there could be several staggered deadlines throughout the year, with only two or three new partners selected each time. This would reduce the administrative burden on the research team and allow for replacing partners that may drop out.

Last, a larger percentage of small organizations were selected as partners compared with large or medium-sized organizations. While we did not expressly exclude larger partners, the advisory committee was charged with choosing a diverse group of partners, not just the largest or most experienced organizations. However, this may have resulted in partners with less organizational capacity to engage in the project's research aspects. When designing the RFP, the research team felt that an organization's strengths in engaging with lower-income communities and operating a mobile market would translate into better recruitment for the study. In the future, we would recommend explicitly evaluating capacity for research participation, not just capacity to implement the intervention. For example, including selection criteria related to previous program evaluation experience could indicate a partner's capacity for data collection. Considering the number of staff dedicated to the program might be another approach to measuring capacity.

In addition to the benefits mentioned, using an RFP process for recruitment of partners increased transparency in partner selection and has the potential to cultivate and strengthen academic-community partnerships for research. We anticipate this process may also reduce the likelihood of study partner drop-out. While we used the RFP to identify partners, who in turn identified community sites, researchers could also use this process to recruit sites directly. Our study provided funding for partners, as this is considered best practice for community-based research, but technical assistance alone may also offer value to study partners. Other studies, such as Growing Food Connections,20 have successfully recruited partners using an RFP offering technical assistance but not funding. At a minimum, we hope that this model will encourage implementation researchers to be more thoughtful and systematic about partner selection and to publish the methods and results of their partner selection processes.

Supplementary Material

Supplementary appendix (PDF).

Lucia A. Leone
Department of Community Health and Health Behavior, School of Public Health and Health Professions, University at Buffalo
Christina Kasprzak
Department of Community Health and Health Behavior, School of Public Health and Health Professions, University at Buffalo
Anne Lally
Department of Community Health and Health Behavior, School of Public Health and Health Professions, University at Buffalo
Lindsey Haynes-Maslow
Department of Agricultural and Human Sciences, North Carolina State University
Leah N. Vermont
Department of Community Health and Health Behavior, School of Public Health and Health Professions, University at Buffalo
Caroline Horrigan-Maurer
Department of Family Medicine, Jacobs School of Medicine and Biomedical Sciences, University at Buffalo
Laurene Tumiel-Berhalter
Department of Community Health and Health Behavior, School of Public Health and Health Professions, University at Buffalo
Alice Ammerman
Department of Nutrition, University of North Carolina at Chapel Hill, Center for Health Promotion and Disease Prevention
Samina Raja
Department of Urban and Regional Planning, School of Architecture and Planning, University at Buffalo
Corresponding Author: Lucia A. Leone, PhD, Department of Community Health and Health Behavior, School of Public Health and Health Professions, 333 Kimball Tower, University at Buffalo, Buffalo, NY 14214. E-mail: lucialeo@buffalo.edu
Submitted 12 February 2021, revised 2 February 2022, accepted 21 March 2022.

ACKNOWLEDGMENTS

Funded by the National Cancer Institute (R37CA215232). The authors thank our advisory committee and all of the organizations that participated in the Request for Partners process. They also thank Jill Tirabassi and Angelica Tutasi for providing critical feedback on this work.

REFERENCES

1. Delafield R, Hermosura AN, Ing CT, Hughes CK, Palakiko DM, Dillard A, et al. A community-based participatory research guided model for the dissemination of evidence-based interventions. Prog Community Health Partnersh. 2016; 10(4):585–95.

2. Holt CL, Chambers DA. Opportunities and challenges in conducting community-engaged dissemination/implementation research. Transl Behav Med. 2017;7(3):389–92.

3. Milat AJ, Bauman A, Redman S. Narrative review of models and success factors for scaling up public health interventions. Implement Sci. 2015;10(1):113.

4. Glasgow RE, Lichtenstein E, Marcus AC. Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Public Health. 2003;93(8):1261–7.

5. Flay BR, Biglan A, Boruch RF, Castro FG, Gottfredson D, Kellam S, et al. Standards of evidence: Criteria for efficacy, effectiveness and dissemination. Prev Sci. 2005;6(3):151–75.

6. Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, et al. Community–academic partnerships: A systematic review of the state of the literature and recommendations for future research. Milbank Q. 2016;94(1):163–214.

7. Wright MT, Roche B, von Unger H, Block M, Gardner B. A call for an international collaboration on participatory research for health. Health Promot Int. 2009;25(1):115–22.

8. Pinto RM, Wall MM, Spector AY. Modeling the structure of partnership between researchers and front-line service providers: Strengthening collaborative public health research. J Mixed Methods Res. 2014;8(1):83–106.

9. Pinto RM, Witte SS, Wall MM, Filippone PL. Recruiting and retaining service agencies and public health providers in longitudinal studies: Implications for community-engaged implementation research. Methodol Innov. 2018;11(1):2059799118770996.

10. James AS, Richardson V, Wang JS, Proctor EK, Colditz GA. Systems intervention to promote colon cancer screening in safety net settings: Protocol for a community-based participatory randomized controlled trial. Implement Sci. 2013;8(1):58.

11. Pinto RM, Spector AY, Witte SS, Gilbert L. Systematizing planning and formative phases of HIV prevention research: Case studies from Brazil, Mongolia, and Kazakhstan. Global Social Welfare. 2014;1(3):137–44.

12. Seifer SD. Building and sustaining community-institutional partnerships for prevention research: Findings from a national collaborative. J Urban Health. 2006;83(6):989–1003.

13. Green L, Nasser M. Furthering dissemination and implementation research: The need for more attention to external validity. Dissemination and Implementation Research in Health: Translating Science to Practice. New York: Oxford University Press. 2012.

14. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50(3):217–26.

15. Leone LA, Haynes-Maslow L, Ammerman AS. Veggie Van pilot study: Impact of a mobile produce market for under-served communities on fruit and vegetable access and intake. J Hunger Environment Nutr. 2017;12(1):89–100.

16. Haynes-Maslow L, Auvergne L, Mark B, Ammerman A, Weiner BJ. Low-income individuals' perceptions about fruit and vegetable access programs: A qualitative study. J Nutr Educ Behav. 2015;47(4):317–24 e1.

17. Leone LA, Tripicchio GL, Haynes-Maslow L, McGuirt J, Grady Smith JS, Armstrong-Brown J, et al. Cluster randomized controlled trial of a mobile market intervention to increase fruit and vegetable intake among adults in lower-income communities in North Carolina. Int J Behav Nutr Phys Activity. 2018;15(1):2.

18. Tripicchio GL, Grady Smith J, Armstrong-Brown J, McGuirt J, Haynes-Maslow L, Mardovich S, et al. Recruiting community partners for Veggie Van: Strategies and lessons learned from a mobile market intervention in North Carolina, 2012–2015. Prev Chronic Dis. 2017;14:E36.

19. Kasprzak C, Schoonover J, Gallicchio D, Haynes-Maslow L, Vermont L, Ammerman A, et al. Using common practices to establish a framework for mobile produce markets in the United States. Journal of Agriculture, Food Systems, and Community Development. 2021;10(4):73–84.

20. Raja S, Whittaker J, Hall E, Hodgson K, Leccese J. Growing food connections through planning: Lessons from the United States. In: Cabannes Y, Marocchino C, editors. Integrating food into urban planning. London: UCL Press; 2018. p. 134–53.

21. Clark JK, Freedgood J, Irish A, Hodgson K, Raja S. Fail to include, plan to exclude: Reflections on local governments' readiness for building equitable community food systems. Built Environment. 2017;43(3):315–27.

Appendix 1. HISTORY OF THE VEGGIE VAN STUDY AND COMMUNITY ENGAGEMENT

Research Phase: Pilot Study (2011–2013), N = 59
Study Overview: The Veggie Van program was tested at three community locations in North Carolina using a pre-post design. At follow-up, individuals who reported shopping at Veggie Van frequently (n = 32) increased their F&V consumption by 0.41 servings/day compared with a decrease of −1.19 for those who rarely/never used Veggie Van (n = 27), a total difference of 1.6 servings/day (P = .01) (1).
Community Engagement: The Veggie Van program was developed and run by the North Carolina-based non-profit organization Community Nutrition Partnership (CNP).* The program was developed using community and stakeholder input (2, 3). Researchers evaluated the program.

Research Phase: Efficacy Study (2012–2015), N = 142
Study Overview: The Veggie Van program was tested in a cluster-randomized controlled trial in 12 communities in North Carolina. Communities received the mobile market right away (intervention) or 6 months after randomization (delayed intervention control) (4). At follow-up, adjusted change in F&V consumption was 0.95 cups/day greater for intervention participants (p = 0.005), but was attenuated to 0.51 cups per day (p = 0.11) after removing extreme values (5).
Community Engagement: CNP and the research team jointly wrote the grant and recruited communities. CNP delivered the mobile market intervention and assisted with participant recruitment. The research team collected data via phone survey.

Research Phase: Effectiveness Study (ongoing), N = 960 (proposed)
Study Overview: The Veggie Van model will be tested in 32 communities across the Eastern United States using a cluster-randomized controlled trial. The study is ongoing; the current paper describes the Request for Partners (RFP) process to identify new partners interested in working with researchers to test the Veggie Van model in new communities which they identify and recruit.
Community Engagement: Researchers designed the study based on lessons learned from the efficacy study and input from experienced mobile market operators (6, 7). The RFP process was guided by an outside advisory committee. New partners will recruit communities, recruit participants, and deliver the intervention following the Veggie Van model with the support of the research team. Partners are encouraged to develop community advisory committees to oversee their program. The research team and partners will both collect data from participants. Data analysis will be conducted by the research team with partner input.

Research Phase: Implementation Study (ongoing)
Study Overview: During the effectiveness study, researchers will collect data via interviews, focus groups, and process measure surveys. Data will be collected from staff at partner organizations that operate the mobile markets, community sites that host the mobile markets, and community members who use the markets to understand how the Veggie Van model is being adapted and implemented across communities. Feedback from community partners and community members will be used to iteratively improve the Veggie Van model for future dissemination.

* Community Nutrition Partnership (CNP) is the nonprofit organization that developed and ran the first Veggie Van mobile market in North Carolina. CNP stopped running a mobile market in 2016, but CNP leadership remained active in development of the Veggie Van model and associated toolkit (www.myveggievan.org).

Appendix 1 References

1. Leone LA, Haynes-Maslow L, Ammerman AS. Veggie Van Pilot Study: Impact of a Mobile Produce Market for Underserved Communities on Fruit and Vegetable Access and Intake. Journal of Hunger & Environmental Nutrition. 2017;12(1):89–100.

2. Haynes-Maslow L, Auvergne L, Mark B, Ammerman A, Weiner BJ. Low-Income Individuals' Perceptions About Fruit and Vegetable Access Programs: A Qualitative Study. J Nutr Educ Behav. 2015;47(4):317–24 e1.

3. Haynes-Maslow L, Parsons SE, Wheeler SB, Leone LA. A qualitative study of perceived barriers to fruit and vegetable consumption among low-income populations, North Carolina, 2011. Preventing Chronic Disease. 2013;10:1–10.

4. Leone LA, Tripicchio GL, Haynes-Maslow L, McGuirt J, Grady Smith JS, Armstrong-Brown J, et al. A Cluster-Randomized Trial of a Mobile Produce Market Program in 12 Communities in North Carolina: Program Development, Methods, and Baseline Characteristics. J Acad Nutr Diet. 2019;119(1):57–68.

5. Leone LA, Tripicchio GL, Haynes-Maslow L, McGuirt J, Grady Smith JS, Armstrong-Brown J, et al. Cluster randomized controlled trial of a mobile market intervention to increase fruit and vegetable intake among adults in lower-income communities in North Carolina. International Journal of Behavioral Nutrition and Physical Activity. 2018;15(1):2.

6. Tripicchio GL, Grady Smith J, Armstrong-Brown J, McGuirt J, Haynes-Maslow L, Mardovich S, et al. Recruiting Community Partners for Veggie Van: Strategies and Lessons Learned From a Mobile Market Intervention in North Carolina, 2012–2015. Prev Chronic Dis. 2017;14:E36.

7. Kasprzak C, Schoonover J, Gallicchio D, Haynes-Maslow L, Vermont L, Ammerman A, et al. Using common practices to establish a framework for mobile produce markets in the United States. Journal of Agriculture, Food Systems, and Community Development. 2021;10(4):73–84.
