Overview of Key Findings: Project SPARK Landscape Analysis of Evaluation Technical Assistance to Build the Evaluation Capacity of Human Services and Related Programs

Publication Date: June 3, 2022

Introduction

Research Questions

  1. What is the landscape of evaluation TA? That is, what evaluation TA approaches exist, what are common issues and challenges associated with implementing evaluation TA, and how does evaluation TA vary based on focal population or context?
  2. What do we know about the effectiveness of evaluation TA?
  3. What can we learn from evaluation TA initiatives outside of human services and related programs or supported by non-federal funding? Do approaches or effectiveness vary by sector or funding source? Have other agencies examined these questions about the promise or effectiveness of evaluation TA initiatives?

Enhancing the capacity of human services organizations to conduct evaluations and apply findings can help those organizations improve service delivery and better meet the needs of the people they serve. Supporting Partnerships to Advance Research and Knowledge (Project SPARK), sponsored by the Administration for Children and Families (ACF), Office of Planning, Research, and Evaluation (OPRE), in consultation with the Office of Family Assistance, provides evaluation technical assistance (TA) to Temporary Assistance for Needy Families (TANF) and related programs.

As part of Project SPARK, ACF sought to understand the range of approaches to evaluation TA, including which initiatives are most promising for building participants’ evaluation capacity. These insights can inform and improve future evaluation TA efforts. This overview summarizes the full report, which documents approaches and evidence of promise or effectiveness of evaluation TA initiatives and draws lessons for human services and related programs, with a focus on TANF and workforce development programs. The report also proposes a definition of evaluation TA and a conceptual framework that specifies the common components of evaluation TA aimed at building participants’ evaluation capacity, and it identifies lessons from the findings for each focal audience. Key lessons include the importance of providing more support for programs that participate in evaluation TA and the need for more research on the effectiveness of different evaluation TA strategies.

Purpose

This overview summarizes the full report, which proposes a definition of evaluation TA and a conceptual framework for evaluation TA and documents approaches and evidence of promise or effectiveness of evaluation TA initiatives. Findings summarized in this overview can inform policy and practice for the following focal audiences: (1) providers of evaluation TA at the federal, state, tribal, and local levels; (2) practitioners at the state, tribal, and local levels who might participate in evaluation TA; (3) researchers, including those studying federally funded evaluation TA initiatives; and (4) policymakers and evaluation TA funders.

Key Findings and Highlights

Landscape of evaluation TA (Research Question 1)

  • Defining evaluation TA. Our preliminary review of relevant literature and consultations with experts suggested that the concept of evaluation TA was neither clearly nor consistently defined in the research or practice literature. To fill this gap, we conducted a more extensive, systematic literature review and gathered additional input from experts to specify a definition and conceptual framework of evaluation TA that aims to build staff and institutional evaluation capacity of human services and related programs.
  • Guiding principles for designing and delivering evaluation TA. Two guiding principles were common across the evaluation TA initiatives examined for the report and were identified by experts as important for designing and delivering evaluation TA: engaging diverse perspectives and incorporating evaluation and analytic thinking into everyday organizational decision making.
  • Evaluation TA strategies and topics. A common upfront strategy among the evaluation TA initiatives examined for the report was conducting a collaborative needs assessment and using the results to tailor the TA initiative. Other common strategies included workshops, direct coaching, mentoring, and consultations between the evaluation TA provider and participant. Common evaluation TA topics covered key stages of the evaluation life cycle.
  • Common issues and challenges. Evaluation TA providers and participants commonly faced challenges such as limited time for staff at the participant organization to engage with the evaluation TA, limited resources to sustain practices developed through the TA, and difficulty obtaining buy-in from the participant organization’s leadership.

Effectiveness of evaluation TA (Research Question 2)

  • Limited rigorous research. Few rigorous studies have examined the effect of evaluation TA on key outcomes, including building the evaluation capacity of evaluation TA participants.
  • Some evidence of improvements in key outcomes for evaluation TA participants. Existing research shows some associations between evaluation TA and improvements in key outcomes, such as evaluation TA participants’ knowledge, skills, and abilities related to evaluation; development and use of evaluation tools; use of rigorous evaluation designs and methods; intervention implementation fidelity; positive outcomes for intervention participants; and increased organizational commitment to evaluation.

Lessons from evaluation TA initiatives outside of human services and related programs or supported by non-federal funding (Research Question 3)

  • Evaluation TA guiding principles, topics, and strategies—as well as evidence of promise or effectiveness—do not seem to vary by sector or funding source. This landscape analysis did not uncover systematic differences in evaluation TA approaches or evidence of effectiveness or promise by whether the initiative was within or outside the human services sector or supported by federal or non-federal funding.
  • A broader literature synthesis could help distinguish differences by sector and funding source. Drawing stronger conclusions about how evaluation TA approaches and evidence of promise or effectiveness vary by sector or funding source would require reviewing additional literature, using a larger sample and a sampling strategy designed specifically to address these questions.

Methods

To document the landscape of evaluation TA initiatives and draw lessons for human services and related programs, we drew on the following: (1) telephone discussions with evaluation TA providers and developers representing 14 evaluation TA initiatives; (2) a series of three meetings and ongoing consultation with a diverse group of 10 evaluation TA experts, including federal staff overseeing evaluation TA initiatives, developers and providers of evaluation TA, researchers, and state and local practitioners who provide or participate in evaluation TA; and (3) a systematic literature review.

Recommendations

Findings have implications for the evaluation TA and evaluation capacity building efforts of evaluation TA providers, practitioners, researchers, policymakers, and evaluation TA funders. Key lessons include the importance of providing more support for programs that participate in evaluation TA and the need for more research on the effectiveness of evaluation TA.

Citation

Stanczyk, Alexandra, Mary Anne Anderson, Armando Yanez, and Lauren Amos (2022). Overview of Key Findings: Project SPARK Landscape Analysis of Evaluation Technical Assistance to Build the Evaluation Capacity of Human Services and Related Programs. OPRE Report #2022-87. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Glossary

ACF:
Administration for Children and Families
OPRE:
Office of Planning, Research, and Evaluation
Project SPARK:
Supporting Partnerships to Advance Research and Knowledge
TA:
Technical Assistance
TANF:
Temporary Assistance for Needy Families