“We Get a Chance to Show Impact”: Program Staff Reflect on Participating in a Rigorous, Multi-site Evaluation

Publication Date: March 7, 2019
Introduction

This brief summarizes findings from interviews conducted with leadership and staff from eight programs that participated in the Pathways for Advancing Careers and Education (PACE) Evaluation, a rigorous, multi-site evaluation of “career pathways” programs.

Purpose

Interviewees describe their experiences implementing the evaluation procedures, the benefits of participating in a randomized controlled trial (RCT) study, how they overcame challenges to participating in an RCT, and the lessons they learned along the way. These reflections offer insights for programs considering participation in a similar evaluation: What factors weighed in the decision to participate? How can a program proactively identify and address potential challenges? Interviewees also highlight important considerations for evaluation teams, particularly when recruiting programs to participate: What benefits do programs find most valuable? How can evaluators best provide support?

Key Findings and Highlights

  • Program leadership and staff found value in participating in an RCT as a way to build evidence of program effectiveness and identify areas for improvement. Increasingly, funders look for evidence of effectiveness from a rigorous study when awarding funding.
  • A variety of program stakeholders initially had concerns about the random assignment methodology. Respondents shared a number of strategies for addressing these concerns, such as facilitating multiple stakeholder meetings and being flexible about which partner organization conducted randomization.
  • Program leaders invested time meeting with frontline staff to explain the purpose of the study, describe staff members’ key roles in it, and address any concerns. They hired or assigned staff open to the study’s purpose and procedures.
  • Participant recruitment was more difficult than anticipated. Staff in all programs noted they needed to increase the number of program applicants to build a study control group, and many also had to scale up their program to ensure enough treatment group slots. This required dedicated recruitment staff, tracking the effectiveness of recruitment strategies, and continually communicating with referral partners.
  • Peer-to-peer learning was an important support for program staff. PACE included in-person partner meetings and other peer-to-peer events, building community and capacity among program partners.
  • A number of staff reported their experience in the study produced unanticipated positive changes within their organization, such as enhanced research and evaluation capacity and more efficient procedures.

Methods

The evaluation team collected information for this brief through telephone discussions with program staff at eight of the nine PACE programs. These discussions occurred in late 2017, three to four years after the end of random assignment. Discussants’ roles varied by program, but typically included current or former program directors and staff involved in implementing the PACE evaluation.

Citation

Hamadyk, J., and K. Gardiner (2018). “We Get a Chance to Show Impact”: Program Staff Reflect on Participating in a Rigorous, Multi-site Evaluation. OPRE Report #2018-123. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.