
Methods for Promoting Open Science in Social Policy Research

April 13, 2020
Topics:
Methods and Tools
Projects:
Methods Inquiries, 2013-2022
Types:
Reports
Download report (pdf)
  • File Size 1 MB
  • Pages 8
  • Published 2020

Introduction

“Open science” represents a broad movement to make all phases of research—from design to dissemination—more transparent and accessible. The scientific community and Federal agencies that support research have a growing interest in open science methods in response to highly publicized news stories and journal articles that cast doubt on research credibility.1 These articles highlighted issues such as data manipulation (e.g., p-hacking), publication bias (e.g., no publication of null results), inability to replicate or reproduce research results,2 and other individual and system-level practices.3 Proponents of open science strive to transform the research ecosystem through a range of methods that encourage open sharing of research information and enable researchers to verify and build on each other’s work.4


1 For example, see Hardwicke, T. E., & Ioannidis, J. P. (2018). Populating the Data Ark: An attempt to retrieve, preserve, and liberate data from the most highly-cited psychology and psychiatry articles. PloS One, 13(8), e0201856 and John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532.

2 Winerman, L. (2017). Trends report: Psychologists embrace open science. American Psychological Association, 48, 90. Retrieved from https://www.apa.org/monitor/2017/11/trends-open-science

3 Ibid.

4 National Academies of Sciences, Engineering, and Medicine. (2018). Open science by design: Realizing a vision for 21st century research. Washington, DC: Board on Research Data and Information, Policy, and Global Affairs. Retrieved from http://sites.nationalacademies.org/pga/brdi/open_science_enterprise/

Purpose

On October 24, 2019, the Administration for Children and Families’ Office of Planning, Research, and Evaluation (OPRE) convened a meeting for participants from Federal agencies, research firms, academia, and other organizations to discuss open science topics.

This report summarizes key themes from the meeting, including the following:

  • What can the research community do to build a “self-correcting” culture?
  • Why and how do researchers implement pre-registration practices?
  • Why and how should researchers promote reproducibility?
  • How can researchers better synthesize evidence and build the social policy evidence base?
  • What are important considerations when implementing open science practices in Federal contexts?

Key Findings and Highlights

  • Open science seeks to address recent concerns about research credibility by making all phases of the research process more transparent and accessible.
  • As a major funder of scientific research, the Federal government has an interest in open science practices, particularly following the passage of the Foundations for Evidence-Based Policymaking Act of 2018.
  • The research community can help address questionable research practices and build a “self-correcting” culture by undertaking efforts to promote open science and implement quality control mechanisms. Strategies might include:
    • Providing other researchers with access to data, code, materials, and notebooks to review the work;
    • Pre-registering analysis plans (e.g., planned sample size, manipulations, measures, analytic strategies, critical hypothesis tests) before conducting analyses to enable research consumers to evaluate the results themselves;
    • Providing open access to journals, preprints, and the peer review process to help readers from all backgrounds better evaluate the quality of research and detect errors;
    • Declaring conflicts of interest.
  • Researchers can implement pre-registration, or the process of registering study plans in a repository before research begins, in several different ways (e.g., on an internal or nonpublic system, on a simple online platform, or in a full-fledged registry). Reasons for doing this include:
    • To constrain flexibility by requiring that data collection and analysis decisions be made ahead of time;
    • To increase transparency and rigor by identifying discrepancies between planned and actual study decisions;
    • To increase the likelihood of peer review; and
    • To prevent researchers from falsifying their research to support a specific hypothesis.
  • Researchers should promote reproducibility, or the process of recreating research results using the same data and code, to help build scientific knowledge and credibility.
    • Although data sharing is a critical part of reproducibility, some stakeholders may resist providing access to sensitive information. However, there are ways to release data while protecting confidential details.
  • Open science practices can help researchers better synthesize evidence and build the social policy evidence base. For example:
    • Open science can facilitate replication, or the process of generating the same results as previous studies using the same procedures but different data.
    • Open science can facilitate meta-analysis, or the statistical or quantitative synthesis of numerical findings from two or more studies.
  • Implementing open science practices in Federal contexts can require particular care and sensitivity, but helps the government meet its obligation of efficiently using taxpayer dollars.

Citation

Holzwart, R., & Wagner, H. (2020). Methods for Promoting Open Science in Social Policy Research (OPRE Report 2020-24). Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Last Reviewed: August 17, 2020