Designing an Evaluation: Resources

Do you want to design an evaluation, or do you need to identify someone who can design and conduct one for you? This page provides resources for organizations as they develop research plans to evaluate adolescent pregnancy prevention (APP) programs.

All high-quality research plans share a number of components: a clearly articulated program model, research questions that matter to decision makers, and a study design that can provide valid answers to those questions. Each item below contains a set of resources that program managers and evaluators may want to consider when developing a research plan. The resources are organized into the following categories:

General Evaluation Planning

These resources describe how to develop research questions for different types of evaluations and how to collect data to address them. The links below can help you articulate your research questions and decide which measures and data to use to answer them.

Developing a Logic Model

A logic model is a visual tool for describing the components of a program and how those components are expected to link to intended outcomes. This section offers resources on developing an effective logic model, one that reflects the program’s theory of change and aligns with your research questions. A simple illustrative sketch follows the resource list below.

  • Logic Model Tip Sheet and Webinar from the Family and Youth Services Bureau defines the basic components of a logic model, describes how to develop one, and provides additional resources and references.
  • W.K. Kellogg Foundation Logic Model Development Guide explains what a logic model is, describes why it is useful, and walks through the process of developing a logic model to enhance program planning, implementation, and evaluation.
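
Neither resource above prescribes code, but as a purely illustrative sketch, the structure a logic model captures can be written down as a simple data structure. Every entry below is a hypothetical example for a generic APP program, not content from the tip sheet or guide.

```python
# A minimal, hypothetical sketch of the components a logic model captures,
# written as a plain data structure. Every entry is an invented example
# for a generic adolescent pregnancy prevention (APP) program.
logic_model = {
    "inputs": ["trained facilitators", "curriculum materials", "funding"],
    "activities": ["weekly classroom sessions", "parent outreach events"],
    "outputs": ["sessions delivered", "youth served"],
    "short_term_outcomes": ["increased knowledge of contraception"],
    "long_term_outcomes": ["reduced rates of teen pregnancy"],
}

# Each research question should map onto a link in this chain, e.g.:
# "Do weekly sessions (activity) increase knowledge (short-term outcome)?"
for component, examples in logic_model.items():
    print(f"{component}: {', '.join(examples)}")
```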

Choosing an Evaluator

These resources provide tips and considerations for choosing an external evaluator for your evaluation. An evaluator can help with designing the study, choosing outcome measures and methods, obtaining research approval, creating data security procedures, collecting and analyzing data, and reporting.

Designing a Rigorous Impact Evaluation

The Department of Health and Human Services (HHS) has established scientific standards to assess completed evaluations of programs designed to improve teenage outcomes related to sexual activity, contraceptive use, sexually transmitted infections, pregnancy, or births. The resources listed below provide an overview of the HHS evidence review standards and discuss important factors to consider when planning a rigorous evaluation of program effectiveness that will meet these standards.
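
The HHS standards themselves are methodological rather than computational, but the minimal sketch below illustrates the core logic of the most rigorous impact design: random assignment makes the treatment and control groups comparable on average, so a simple difference in mean outcomes estimates program impact. All data are simulated, and the assumed effect size and outcome measure are invented for illustration.

```python
import random
import statistics

random.seed(42)

def simulated_outcome(treated):
    # 1 = reports consistent contraceptive use at follow-up (invented measure).
    # We assume a hypothetical true program effect of +0.15 for illustration.
    base_rate = 0.50 + (0.15 if treated else 0.0)
    return 1 if random.random() < base_rate else 0

# Random assignment makes the two groups comparable on average, so the
# difference in mean outcomes is an unbiased estimate of program impact.
treatment = [simulated_outcome(True) for _ in range(100)]
control = [simulated_outcome(False) for _ in range(100)]

impact = statistics.mean(treatment) - statistics.mean(control)
print(f"Estimated impact: {impact:+.2f}")  # expect a value near +0.15
```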

Designing an Implementation or Process Study

An implementation (or process) study offers a scientific and objective approach for describing program services, including who receives services and which services they receive. These resources define core concepts in implementation research and provide concrete steps for designing and conducting an implementation evaluation.

  • Conducting a Process Evaluation reviews core concepts in implementation research and describes key constructs to measure and steps to complete when planning and conducting an implementation study.
  • Measuring Implementation Fidelity defines implementation fidelity and its core components and discusses how it should be assessed as part of a process study (a simple illustrative calculation follows this list).
  • Working with Subgrantees to Monitor Fidelity is for program managers and staff who are overseeing attempts to implement evidence-based adolescent pregnancy prevention (APP) programs. Participants will learn what fidelity means and how to promote the successful implementation of evidence-based programs in the real world. (Access is limited to APP grantees only.)
  • Developing Process Evaluation Questions outlines how to develop research questions for an implementation or process study.
  • The National Implementation Research Network Publications and Resources page provides a commonly used framework and resources for planning and conducting implementation research.
  • Continuous Quality Improvement Tip Sheet explains how continuous quality improvement (CQI) differs from process evaluation, describes different types of CQI, and explains why CQI is important. It also provides suggestions and strategies for initiating, implementing, and building support for CQI within your organization.
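
As a purely illustrative companion to the fidelity resources above, the sketch below computes two commonly discussed fidelity measures, dosage and adherence, for a hypothetical two-site program. The session counts and delivery log are invented, not drawn from any of the linked materials.

```python
# A minimal sketch of two fidelity measures often discussed in process
# studies: dosage (how much of the program was delivered) and adherence
# (whether sessions followed the curriculum). All numbers are invented.
planned_sessions = 10

# Hypothetical delivery log per site: one boolean per session held,
# True if the facilitator delivered it as designed.
delivery_log = {
    "site_a": [True, True, True, False, True, True, True, True, True, True],
    "site_b": [True, True, False, False, True, True, True, True],
}

for site, sessions in delivery_log.items():
    dosage = len(sessions) / planned_sessions     # share of planned sessions held
    adherence = sum(sessions) / len(sessions)     # share delivered as designed
    print(f"{site}: dosage={dosage:.0%}, adherence={adherence:.0%}")
```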

Designing Other Outcome Evaluations

These resources address a variety of research methods that you can use to examine program outcomes, including experimental, quasi-experimental, and non-experimental designs. They focus on choosing the best design possible, even when resources or data are limited. They also present a number of strategies for strengthening non-experimental designs, which include pre-test/post-test designs, time-series designs, longitudinal studies, and post-test only designs.
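
To make the simplest of these non-experimental designs concrete, the sketch below computes the average pre-test-to-post-test change in a hypothetical knowledge score. The scores are invented, and the closing comment notes why such a design usually needs the strengthening strategies these resources describe.

```python
import statistics

# A minimal sketch of a non-experimental pre-test/post-test analysis:
# the same hypothetical knowledge score (0-100) measured for the same
# participants before and after the program. All scores are invented.
pre_scores = [52, 61, 48, 70, 55, 63, 58, 49]
post_scores = [60, 68, 55, 74, 62, 70, 61, 57]

changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_change = statistics.mean(changes)

# A pre/post difference alone cannot rule out outside influences such as
# maturation or concurrent events, which is why these resources suggest
# strengthening the design, e.g., by adding a comparison group.
print(f"Mean pre-to-post change: {mean_change:+.1f} points")
```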

Adapting a Program Model

If you plan to modify a program to address the needs of a particular community or context, this set of resources offers guidance and recommendations for adapting an existing effective program model without removing its core components or diminishing its outcomes.

Disseminating Evaluation Findings

  • Disseminating Evaluation Results Tip Sheet guides projects through the various decisions that need to be made, including what information to disseminate, the audience and format for these communications, and what partners and processes will support success.

Additional Considerations for Evaluation in Tribal Communities

Evaluation research may be viewed with skepticism in Tribal communities, as previous research has often failed to recognize the sovereignty of Native people or to draw on Indigenous practices. This set of resources addresses some of the considerations for conducting program evaluations in Tribal communities and outlines best practices for embarking on this type of work.

Comprehensive Evaluation Planning Guides

These resources aim to provide guidance on many issues faced during an evaluation, from the planning stage through data analysis and dissemination. They cover all of the topics addressed above, as well as some additional topics, including the logistics of data collection, sample retention and attrition, the cost of conducting an evaluation, frequently asked questions when designing an evaluation, and much more.

Comprehensive Needs Assessments

These resources describe how to conduct a comprehensive needs assessment, from planning to data collection to disseminating the findings and acting on them.
