ACF Evaluation Policy

Publication Date: November 9, 2021

Introduction

This evaluation policy builds on the Administration for Children and Families’ (ACF) strong history of evaluation by outlining key principles to govern our planning, conduct, and use of evaluation. This policy reconfirms our commitment to conducting rigorous, relevant evaluations and to using evidence from evaluations to inform policy and practice. ACF seeks to promote rigor, relevance, transparency, independence, and ethics in the conduct of evaluations. This policy addresses each of these principles.

ACF published the evaluation policy in the Federal Register on November 9, 2021: 86 FR 62175

The mission of the ACF is to foster health and well-being by providing federal leadership, partnership, and resources for the compassionate and effective delivery of human services. Our vision is children, youth, families, individuals and communities who are resilient, safe, healthy, and economically secure. The importance of these goals demands that we continually innovate and improve, and that we evaluate our activities and those of our partners. Through evaluation, ACF and our partners can learn systematically so that we can make our services as effective, efficient, and equitable as possible.

Evaluation produces one type of evidence. A learning organization with a culture of continuous improvement requires many types of evidence, including not only evaluation but also descriptive research studies, performance measures, financial and cost data, survey statistics, program administrative data, and feedback from service providers, participants, and other stakeholders. Further, continuous improvement requires systematic approaches to using information, such as regular data-driven reviews of performance and progress. Although this policy focuses on evaluation, the principles and many of the specifics apply to the development and use of other types of evidence as well.

This policy applies to all ACF-sponsored evaluations. While much of ACF’s evaluation activity is overseen by the Office of Planning, Research, and Evaluation (OPRE), ACF program offices also sponsor evaluations through dedicated contracts or as part of their grant-making. In order to promote quality, coordination, and usefulness in ACF’s evaluation activities, ACF program offices will consult with OPRE in developing evaluation activities. Program offices will discuss evaluation projects with OPRE in early stages to clarify evaluation questions and methodological options for addressing them, and as activities progress, OPRE will review designs, plans, and reports. Program offices may also ask OPRE to design and oversee evaluation projects on their behalf or in collaboration with program office staff.

 

Download the Policy (PDF)

Learn more about each of the five principles in the evaluation policy: Rigor, Relevance, Transparency, Independence, and Ethics.

Rigor

ACF is committed to using the most rigorous methods that are appropriate to both the evaluation questions and the populations, circumstances, and settings that are the focus of study; and that are feasible within budget and other constraints. Rigor is not restricted to impact evaluations, but is also necessary in implementation or process evaluations, descriptive studies, outcome evaluations, and formative evaluations; and in both qualitative and quantitative approaches. Rigor requires ensuring that inferences about cause and effect are well founded (internal validity); requires clarity about the populations, settings, or circumstances to which results can be generalized (external validity); and requires the use of measures that accurately capture the intended information (measurement reliability and validity).

In assessing the effects of programs or services, ACF evaluations will use methods that isolate to the greatest extent possible the impacts of the programs or services from other influences such as trends over time, geographic variation, or pre-existing differences between participants and non-participants. For such causal questions, experimental approaches are preferred. When experimental approaches are not feasible, high-quality quasi-experiments offer an alternative. ACF will develop and use methods that are appropriate for understanding diverse populations, taking into account historical, contextual, and cultural factors. Where possible, evaluations will design data collections to allow disaggregation of data and analyses of sub-groups to support understanding of equity. 

ACF will recruit and maintain an evaluation workforce with the knowledge, training, and experience appropriate for planning and overseeing a rigorous evaluation portfolio. To accomplish this, ACF will recruit staff with advanced degrees and experience in a range of relevant disciplines, such as program evaluation, policy analysis, economics, sociology, and child development. ACF will recruit staff with a range of backgrounds, lived experiences, and perspectives and with expertise in approaches appropriate for studying diverse populations. ACF will provide professional development opportunities so that staff can keep their skills current.

ACF will ensure that contractors and grant recipients conducting evaluations have appropriate expertise by emphasizing the capacity for rigor in requests for proposals and funding opportunity announcements. This emphasis entails specifying expectations in the criteria for selecting grant recipients and contractors, and engaging reviewers with evaluation expertise. It also requires allocating sufficient resources for evaluation activities. ACF will generally require evaluation contractors to consult with external advisors who are leaders in relevant fields and who represent diverse backgrounds, lived experiences, and perspectives, through the formation of technical work groups or other means; and to meaningfully engage stakeholders from the programs and communities being studied throughout the evaluation lifecycle.

Relevance

Evaluation priorities should take into account legislative requirements and Congressional interests and should reflect the interests and needs of ACF, HHS, and Administration leadership; ACF program office staff and leadership; ACF partners such as states, territories, tribes, and local grant recipients; service providers; the populations served; researchers; and other stakeholders. Stakeholders should have the opportunity to influence evaluation priorities to meet their interests and needs. Evaluations should be designed to examine questions relevant to the diverse populations that ACF programs serve, such as Black, Latino, and Indigenous and Native American persons, Asian Americans and Pacific Islanders, and other persons of color; members of religious minorities; lesbian, gay, bisexual, transgender, and queer (LGBTQ+) persons; persons with disabilities; persons who live in rural areas; and persons otherwise adversely affected by persistent poverty or inequality. ACF will encourage diversity among those carrying out the work, through building awareness of opportunities and building evaluation capacity among underrepresented groups. ACF will use inclusive and participatory practices in each phase of evaluation planning, execution, and dissemination, as appropriate and feasible.

There must be strong partnerships among evaluation staff, program staff, policy-makers and service providers. Further, for new initiatives and demonstrations in particular, evaluations will be more feasible and useful when planned in concert with the planning of the initiative or demonstration, rather than as an afterthought. Given federal requirements related to procurement and information collection, it can take many months to award a grant or contract and begin collecting data. Thus it is critical that planning for research and evaluation be integrated with planning for new initiatives.

It is important for evaluators to disseminate findings in ways that are accessible and useful to policy-makers, service providers, the communities that ACF serves, and other stakeholders. OPRE and program offices will work in partnership to disseminate information about our research and evaluation activities and findings in a manner that is clear, accessible, and useful to our diverse range of audiences; this includes using plain language, using inclusive language, adhering to principles of clear communication, and developing products accessible to people with disabilities. ACF will require contractors to meaningfully engage stakeholders from the programs and communities involved in studies to improve clarity of presentations, accuracy of interpretations, and effectiveness of dissemination activities.

It is ACF’s policy to integrate both use of existing evidence and opportunities for further learning into all of our activities. Where an evidence base is lacking, we will build evidence through strong evaluations. Where evidence exists, we will use it. Discretionary funding opportunity announcements will require that successful applicants cooperate with any federal evaluations if selected to participate. As legally allowed, programs with waiver authorities should require rigorous evaluations as a condition of waivers. As appropriate, ACF will encourage, incentivize or require grant recipients to use existing evidence of effective strategies in designing or selecting service approaches. The emphasis on evidence is meant to support, not inhibit, innovation, improvement, equity, and learning.

Transparency

ACF will make information about planned and ongoing evaluations easily accessible, typically by posting information on the web about the contractor or grant recipient conducting the work, the evaluation questions, the methods to be used, and the expected timeline for reporting results. ACF will present information about study designs, implementation, and findings at professional conferences.

Study plans will be published in advance. ACF will release evaluation results regardless of the findings. Evaluation reports will describe the methods used, including strengths and weaknesses, and discuss the generalizability of the findings. Evaluation reports will present comprehensive results, including favorable, unfavorable, and null findings. ACF will release evaluation results in a timely manner, usually within two months of a report’s completion.

As appropriate and feasible, ACF will archive evaluation data for secondary use by interested researchers, typically through building requirements into contracts to prepare data sets for secondary use. 

Independence

Independence and objectivity are core principles of evaluation. Agency and program leadership, program staff, service providers, populations and communities studied, and others should participate actively in setting evaluation priorities, identifying evaluation questions, and assessing the implications of findings. However, it is important to insulate evaluation functions from undue influence and from both the appearance and the reality of bias. To promote objectivity, ACF protects independence in the design, execution, analysis, and reporting of evaluations. To this end:

  • ACF will conduct evaluations through the competitive award of grants and contracts to external experts who are free from conflicts of interest.

  • The Deputy Assistant Secretary for Planning, Research, and Evaluation reports directly to the Assistant Secretary for Children and Families; serves as ACF’s Chief Evaluation Officer; has authority to approve the design of evaluation projects and analysis plans; and has authority to approve, release and disseminate evaluation reports.

Ethics

ACF-sponsored evaluations will be conducted in an ethical and equitable manner and will safeguard the dignity, rights, safety, and privacy of participants. ACF-sponsored evaluations will comply with both the spirit and the letter of relevant requirements, such as regulations governing research involving human subjects. ACF will expect contractors to meaningfully engage stakeholders from the programs and communities involved in studies to ensure that programmatic, cultural, linguistic, and historical nuances are accurately and respectfully addressed from the initial study design through execution, analyses, and reporting.