Introduction
The Office of Planning, Research, and Evaluation (OPRE), within the Administration for Children and Families (ACF), offers an annual Evaluation and Monitoring 101 training to build ACF staff skills in assessing program performance and impact. This brief describes the training’s history, content, and learning outcomes, as well as OPRE’s approach to continuously improving the training based on participant feedback. The brief also orients readers to OPRE’s program assessment framework, which is intended to help ACF staff identify the types of program assessment most relevant to the questions they want to answer.
Purpose
The purpose of this brief is to raise awareness of OPRE’s work by highlighting one way that OPRE supports ACF’s capacity to build and use evidence. The information in this brief may be useful for ACF staff who are considering participating in the training, ACF program leadership who are interested in sending staff to the training, and other agencies that are contemplating launching a similar training to build capacity for learning and evaluation.
Key Findings and Highlights
The Evaluation and Monitoring 101 training has evolved over nearly a decade to meet the changing needs of ACF staff and to address growing federal interest in evidence-based decision making. In the April 2022 training, learners participated in seven interactive sessions over the course of two days, guided by skilled facilitators from a range of OPRE divisions. Session topics included:
- Introduction to OPRE
- Logic models and research questions
- Performance measurement
- Continuous quality improvement
- Process and implementation evaluations
- Outcome and impact evaluations
- Planning and implementing program assessment approaches
The training rooted all content in a framework that distinguishes between two categories of program assessment: (1) performance monitoring and improvement, the ongoing monitoring and reporting of program performance, and (2) program evaluation, systematic studies that assess how well programs are working. Each category is associated with different methods that can help staff answer different types of questions to advance the ACF mission.
OPRE is committed to continuously improving the Evaluation and Monitoring 101 training and collects participant feedback both during and after each course. OPRE staff report that the training has led to meaningful, sustained collaboration with program offices. The training may also serve as a model for other agencies looking to improve data collection and analysis practices and, ultimately, service delivery.
Methods
To better understand the history, context, and learning design of the Evaluation and Monitoring 101 training, the Mathematica team began by reviewing and analyzing available materials from past trainings, including course materials and feedback forms. The team also gathered historical information through a group discussion with OPRE staff who have been involved in the training’s design and implementation over time.
Then, a Mathematica team member participated in the April 2022 training to gain firsthand insight into the training content, facilitation methods, and learning outcomes. The team reviewed feedback forms from the 2022 training¹ and analyzed their themes to understand the participant experience and assess which course components resonated most with learners. The Mathematica team is grateful to OPRE for sharing past and current course materials, offering insight into the training’s history, and supporting Mathematica’s participation in the 2022 training.
¹ Data collection through the 2022 training feedback form occurred as part of OMB #0970-0401.
Citation
Alberty, Elizabeth, and Heather Zaveri. “Evaluation and Monitoring 101: The Office of Planning, Research, and Evaluation, within the Administration for Children and Families, Builds Staff Skills to Assess Program Performance and Impact.” OPRE Brief 2022-232. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2022.
Related Documents
DiSalvo, C., and C. Bridges. “Building ACF Evidence Capacity with Evaluation and Monitoring 101.” OPRE Insights Blog, May 17, 2021. Available at https://www.acf.hhs.gov/opre/blog/2021/05/building-acf-evidence-capacity-evaluation-and-monitoring-101. Accessed May 2, 2022.
Glossary
- Monitoring and evaluation: Systematic methods for collecting, analyzing, and using information to answer questions about a program, to strengthen program operations, improve program effectiveness, or inform decisions about future program development.
- Logic model: A tool to describe the resources, assumptions, implementation activities, and program outputs that link the intervention and target population to the intended short-term and long-term outcomes.
- Performance monitoring and improvement: Ongoing monitoring and reporting of program performance.
- Performance measurement: The routine measurement and reporting of data on program inputs, activities, outputs, and/or outcomes as part of program operations.
- Continuous quality improvement: A systematic approach to identifying, describing, and analyzing strengths and problems and then testing, implementing, learning from, and revising solutions.
- Program evaluation: Systematic studies to assess how well a program is working.
- Process and implementation evaluation: A type of evaluation that describes and analyzes a program’s implementation (activities, inputs, and outputs).
- Outcome evaluation: A type of evaluation that identifies changes in participant knowledge, attitudes, and/or behaviors after participating in a program or service.
- Impact evaluation: A type of evaluation that identifies the results or effects of a program by comparing those who received services against those who did not.