
Why and How We Share Reproducible Research at Yale University’s Institution for Social and Policy Studies

Published on Jan 31, 2024

Column Editor’s Note: A recurring theme in the space of reproducibility and replicability is: who should verify completeness, robustness, and correctness of artifacts in support of computational reproducibility, beyond the regular to and fro in the scientific literature? I (Lars Vilhuber, the Column Editor of this Reinforcing Reproducibility and Replicability column) also happen to be in charge of reproducibility verifications at the journals of a disciplinary society, and journals and societies are often seen as key players in this space (see Kim Weeden’s recent article on this in sociology). I am therefore always intrigued when reproducibility verification emerges organically within other institutions. In the present piece, Limor Peer describes the history and current status of verification efforts at the Institution for Social and Policy Studies at Yale, the motivating vision and how it emerged, and what they currently do to support researchers in improving the reproducibility of their computational work. Of particular interest is the deep and exemplary university-wide support for this work. A corollary of such work, as Limor points out, is the set of new competencies that researchers, but also university research support specialists of all types, must develop and continue developing. Limor’s work complements two articles in the original launch edition of this column, by Graham MacDonald (writing about open data efforts at the Urban Institute) and Courtney Butler (writing about a process similar to Limor’s at the Federal Reserve Bank of Kansas City). I will continue to highlight such efforts in future columns.

Keywords: reproducibility, computational reproducibility, data sharing, code sharing, social science


Introduction

In the United States, as elsewhere, calls for public access to scientific research, including data and code, are intensifying (e.g., Holdren, 2013; NASEM, 2019; Nelson, 2022). These calls are motivated by the goal of getting more value out of public investment in science, achieved by making research more reproducible and transparent. Open research is understood as better science and is increasingly expected.

Research institutions have an interest in verifying the reproducibility of their research before it is published. This is because of research institutions’ commitment to responsible research and the need to build competencies and capacity around open research in light of a changing research culture. I offer this perspective as the manager of a data archive at Yale University’s Institution for Social and Policy Studies (ISPS), where we have been verifying and sharing reproducible research for more than 10 years.

About the ISPS Data Archive

The ISPS was founded in 1968 as an interdisciplinary center to support social science and public policy research at Yale University. Since 2000, ISPS has pioneered the use of randomized controlled trials, or field experiments, and established itself as an important center for their implementation, especially in political science, a discipline in which this method was quite uncommon at the time. In 2011, ISPS launched a database for sharing and archiving the accumulated body of knowledge from these experiments and other data-driven computational social science research. ISPS invites affiliated researchers1 to deposit the data, computer code, and associated materials used to produce original research (i.e., a ‘replication package’) in what is known as the ISPS Data Archive (Peer & Green, 2012).

Since the early 2010s, dramatic developments in infrastructure, tools, and guidelines for research data management and sharing have reshaped the scholarly landscape. Researchers now have more choices of where to deposit and share their data and code, and they are increasingly required or encouraged to do so by journals and funders. The emergence and broad acceptance of the FAIR principles (Wilkinson et al., 2016) over the last decade propelled all stakeholders to implement better practices (e.g., data citation) and follow standards (e.g., use of persistent identifiers, or PIDs). However, there is less agreement on which other standards to prioritize (for example, independent understandability, long-term reusability, reproducibility) or on who bears responsibility for upholding them. ISPS set out to take a broad view that acknowledges these additional standards under the banner of open research.

Vision and Standards

The ISPS vision is to enable continued, long-term access to and independent reuse of the replication package and to ensure that the quality of the objects meets community standards for FAIR and for long-term archival preservation. From the archive’s early days, ISPS aimed to meet its responsibility to assist researchers who wish to disseminate and archive research products that support published scientific claims. ISPS’s assistance includes a review of the replication package to confirm that the materials indeed support the reported findings.

ISPS believes that it has both responsibility and expertise to assist researchers who wish to disseminate and archive research products that support published scientific claims and has created a process to ensure computational reproducibility. (https://isps.yale.edu/research/data/approach)

This orientation to responsible data and code sharing stems from strongly held principles. First, a set of values related to research ethics: rigor, transparency, and integrity. These values stipulate that standards-based research practices must be followed in every aspect of the research: not only in gathering and analyzing data but also in disseminating and archiving the replication package. Second, the values of stewardship, curation, and preservation, which represent a commitment to the longevity of research materials. Third, responsible data and code sharing calls for additional standards: that materials are usable, reproducible, and independently understandable (while in compliance with all legal and ethical limitations). These principles confer credibility on the research and align with a scientific ethos that elevates the ability to extend and build upon previous findings.

Framework and Workflow

ISPS built a process to review materials underlying research claims and ensure the computational reproducibility of replication packages (see https://isps.yale.edu/research/data/approach). The process is based on the Data Quality Review (DQR) framework, which prescribes actions to curate data and code and to review the computational reproducibility of the materials. In addition, ISPS developed software to facilitate the curation and review workflow, the Yale Application for Research Data (YARD) (Peer & Dull, 2020).

Eligible studies are empirical studies supported by ISPS that include original quantitative data. As part of the DQR, the data review includes metadata enhancement and disclosure risk assessment. ISPS currently prioritizes open data but also reviews replication packages that include restricted data, in which case the published replication package is incomplete (following the motto, ‘as open as possible and as restricted as necessary’). The code review involves a computational reproducibility check to verify that the code executes and to compare the code output with the numerical results reported in the manuscript (for more details, see Peer et al., 2014). Replication packages are published once ISPS verifies computational reproducibility. In cases where full verification is not achievable or feasible, ISPS publishes a curator README document as part of the replication package (see Ottone & Peer, 2023).
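To make the check concrete, the following is a minimal, purely illustrative sketch of such a comparison, written in Python. It is not ISPS’s actual tooling (the YARD workflow is described in Peer & Dull, 2020), and the script path, statistic names, expected values, and tolerance are all hypothetical. The logic simply mirrors the review described above: re-run the analysis code, then compare its output against the values reported in the manuscript.

```python
import re
import subprocess

# Illustrative sketch only: not ISPS's actual tooling or the YARD workflow.
# File names, expected values, and the tolerance below are hypothetical.
EXPECTED = {"treatment_effect": 0.042, "standard_error": 0.013}  # from manuscript
TOLERANCE = 1e-3  # allowable numerical difference

# Step 1: execute the replication script and capture its printed output.
result = subprocess.run(
    ["python", "analysis/run_analysis.py"],  # hypothetical script path
    capture_output=True,
    text=True,
    check=True,
)

# Step 2: parse statistics from output lines such as "treatment_effect: 0.0421".
reported = {}
for line in result.stdout.splitlines():
    match = re.match(r"(\w+):\s*(-?\d+(?:\.\d+)?)", line.strip())
    if match:
        reported[match.group(1)] = float(match.group(2))

# Step 3: flag any statistic that is missing or deviates beyond the tolerance.
for name, expected in EXPECTED.items():
    actual = reported.get(name)
    if actual is None or abs(actual - expected) > TOLERANCE:
        print(f"MISMATCH: {name} expected {expected}, got {actual}")
    else:
        print(f"OK: {name} = {actual}")
```

In practice, a replication package may involve multiple languages and output formats (tables, figures, logs), and a human curator still judges completeness and documentation; automated comparison of this kind would address only the numerical part of the check.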

The ISPS internal review establishes a collaborative environment in which the archive team lends researchers its expertise in preparing and sharing reproducible research. ISPS provides this service to its affiliates, along with access to research infrastructure and guidance on various aspects of open research.

A Well-Tended Garden

ISPS chose to strongly recommend, but not require, that its affiliated researchers engage with the review process or deposit in the ISPS Data Archive before sharing data and code. Instead of mandating participation, ISPS emphasizes the practical benefit that responsible data and code sharing provides: an opportunity for researchers to have professionals review their materials before they are shared with the scientific community.

ISPS implements its internal review function by means of a ‘push’ or a ‘pull.’ Researchers can request a review prior to submission to a journal or during the journal review process (‘push’). Alternatively, and most commonly, ISPS will request a replication package from the researcher or otherwise obtain copies of a replication package made available elsewhere to perform the review (‘pull’).2 ISPS will typically initiate pull requests when it is informed of the pending or actual publication of a manuscript or replication package.3 In all cases, ISPS’s review will result in the deposit of the replication package in the ISPS Data Archive and its publication on the ISPS website. In so doing, we aim to establish internal review as a matter of course.

This is a small-scale operation, a modest collection for a designated community: As of December 2023, the archive holds about 130 verified replication packages. For perspective, between 2013 and 2023, 432 articles were published in the ISPS publication database (see https://isps.yale.edu/research/publications), most of them eligible for internal review. The review in its early years focused on a backlog of previously published field experiments and has since continued in a consistent, if limited, capacity. Over this period, we have seen some evidence of a move toward ‘push,’ with researchers seeking pre-submission review. For example, there were three requests for review in 2023, three in 2022, and one in 2021 (for reference, a total of 45, 44, and 46 articles, respectively, were published in the ISPS publication database). Other changes in upstream behaviors include better documentation of data (provision of a README file or a codebook), cleaner code, and overall better organization of the replication package (Peer, 2022).

The curation team benefits from proximity to researchers, communicating with them as a trusted partner on any issues that surface during the review. While ISPS has not systematically measured these changes, we internally document anecdotal evidence and, based on informal conversations and observations, have attributed behavior change not only to researchers’ familiarity with ISPS procedures and positive experience with the review process, but also to greater appreciation of this practice in light of an emergent open research culture.

Broader Institutional Support

In addition to the commitment of ISPS resources, the archive has benefited from the support of university partners, including the Office of Digital Assets and Infrastructure (ODAI), which sponsored the initial ISPS pilot; the Yale University Library, which consults on solutions and services around data management, curation, and preservation; Yale University Information Technology, which took on support of the YARD technology; and the recently established Data-Intensive Social Science Center, which is cosponsoring the archive as part of its research support services portfolio.

Research Institutions in the Age of Open Research

This section poses the general question: What is required of research institutions in the age of open research?

Specifically with respect to reproducible research, and abstracting from the experience of the ISPS Data Archive described in the previous section, my position is that research institutions have a strong interest in verifying the reproducibility of replication packages before they are published, whether or not they themselves choose to publish replication packages. There are two primary reasons for this, as described below: Adherence to a basic principle of science and fulfillment of their function as an educational enterprise.

Science Invites Scrutiny

Research institutions have an ethical obligation to produce research that adheres to the highest standards at every stage of the research lifecycle. The scientific values of rigor, transparency, and integrity align with open research, which expects data and code to withstand scrutiny. Institutions committed to responsible research, whether entire universities or the centers and labs within them, must also commit to developing socio-technical infrastructure in support of these values. Academic institutions tend to focus on responsible research early in the research lifecycle, for example, via institutional review boards. But later stages of the research lifecycle also require a level of internal review, especially in the age of open research, as an expression of commitment to the values of stewardship and preservation.

Such review, including reproducibility verification, introduces a bit of useful friction, which can reduce cost in the system overall (Frith, 2020; Pownall & Hoerst, 2022), for example, by minimizing minor but potentially costly annoyances to others attempting to reuse research outputs.

No one is immune from making mistakes. In research, mistakes might include analyzing raw data instead of cleaned data, reversing variable labels, transcribing information incorrectly, or inadvertently saving over a file. The consequences of these kinds of mistakes can range from minor annoyances like wasted time and resources to major issues such as retraction of an article. (Strand, 2023)

New Competencies in Support of Culture Change

Research institutions can be a locus of open research capacity building by developing new competencies in support of it. They are well positioned to provide such competencies to the researchers they employ and to create a pathway for collaborative and inclusive environments in which faculty researchers work alongside professionals with complementary skills. Academic institutions are a critical part of the research ecosystem and can be catalysts for culture change around open research.

Whether acting out of a sense of mission, self-interest, or compliance, academic institutions would do well to develop competencies around open research. Robust open research capacity has several benefits for academic institutions: Supporting institutions’ mission to advance knowledge for the public good, building institutional memory for research projects (Nolan, 2023), bridging the chasm between e-infrastructure providers and scientific domain specialists (Ayris et al., 2016), and helping institutions attract and retain top faculty and researchers.

Technology alone cannot meet the challenge of open research. Open research capacity can be achieved in-house by creating services in this area (e.g., data curation, archiving, reproducibility verification), by providing discipline-specific research infrastructure (e.g., computer clusters, large data storage), by nurturing a workforce of research support specialists (e.g., research software engineers, research scientists, research managers, data analysts), and by training researchers on various aspects of open research (e.g., open-source software, coding skills, version control) to establish habits.

Teaching students to work reproducibly enables easier and deeper evaluation of their work; having them reproduce parts of analyses by others allows them to learn skills like exploratory data analysis that are commonly practiced but not yet systematically taught; and training them to work reproducibly will make their post-graduation work more reliable. (Donoho, 2017)

Conclusion

Open research will be achieved through a concerted effort by a number of different actors, researchers, publishers, and funders among them. For example, publishers have made advances in this area and are increasingly subjecting replication packages to review and verification. However, as the National Academies of Sciences, Engineering, and Medicine points out, “this process has not been adopted at most journals because it requires a major commitment of resources” (NASEM, 2023, p. 190).

Research institutions are especially well-positioned to bridge gaps—or act as a pressure point—between these actors. Institutions can lead and shape culture change by helping establish open research as routine practice, as in the case of the ISPS review. They can provide infrastructure for mentoring early career researchers on adopting open science practices, grant professional recognition for sharing data, preprints, and other research outputs, and develop organizational workflows that incentivize open by default. Given their inherent strengths—a scientific tradition of inviting scrutiny and the capacity to develop organizational competencies—institutions can play an important role.

The ISPS internal review is one model. Institutions might choose to build capacity for open research in-house, outsource some aspects (e.g., third-party reproducibility checks), join forces regionally on key areas, or some mix of strategies depending on resources, scale, and other considerations. In the case of ISPS, a commitment to open and reproducible research generated processes, workflows, and practices for verifying reproducibility. We believe that such commitment underscores the credibility and rigor of the research and ultimately helps others reap the benefits of the research.


Acknowledgments

The author gratefully acknowledges the helpful comments from the editors and three anonymous reviewers and the valuable feedback on previous drafts from participants of the CRRESS webinar series and members of the ISPS team. The views expressed herein are those of the author and do not necessarily represent the views of ISPS or Yale University.

Disclosure Statement

Peer’s original contribution grew out of an oral presentation in Session 7 of the CRRESS series. The presentation is available on YouTube (archived as MacDonald et al., 2023). An earlier version of this paper was published at https://labordynamicsinstitute.github.io/crress-book/sessions/7/peer/.

The Conference on Reproducibility and Replicability in Economics and the Social Sciences (CRRESS) is funded by National Science Foundation Grant #2217493.


References

Ayris, P., Berthou, J-Y., Bruce, R., Lindstaedt, S., Monreale, A., Mons, B., Murayama, Y., Södergård, C., Tochtermann, K., & Wilkinson, R. (2016). Realising the European Open Science Cloud. European Union. https://doi.org/10.2777/940154

Donoho, D. (2017). 50 years of data science. Journal of Computational and Graphical Statistics, 26(4), 745–766. https://doi.org/10.1080/10618600.2017.1384734

Frith, U. (2020). Fast lane to slow science. Trends in Cognitive Sciences, 24(1), 1–2. https://doi.org/10.1016/j.tics.2019.10.007

Holdren, J. (2013). Memorandum for the heads of executive departments and agencies: Increasing access to the results of federally funded scientific research. White House Office of Science and Technology Policy (OSTP). https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/ostp_public_access_memo_2013.pdf

MacDonald, G., Peer, L., Butler, C., & Michuda, A. (2023). Why can or should research institutions publish replication packages? CRRESS, Conference on Reproducibility and Replicability in Economics and the Social Sciences. Labor Dynamics Institute. https://doi.org/10.7298/pntg-rw59

National Academies of Sciences, Engineering, and Medicine. (2019). Reproducibility and replicability in science. National Academies Press. https://doi.org/10.17226/25303

National Academies of Sciences, Engineering, and Medicine. (2023). Behavioral economics: Policy impact and future directions. National Academies Press. https://doi.org/10.17226/26874

Nelson, A. (2022). Memorandum for the heads of executive departments and agencies: Ensuring Free, immediate, and equitable access to federally funded research. White House Office of Science and Technology Policy (OSTP). https://www.whitehouse.gov/wp-content/uploads/2022/08/08-2022-OSTP-Public-Access-Memo.pdf

Nolan, R. (2023, February 21). Building institutional memory for research projects – Why education is key to long-term change. The LSE Impact Blog. https://blogs.lse.ac.uk/impactofsocialsciences/2023/02/21/building-institutional-memory-for-research-projects-why-education-is-key-to-long-term-change/

Ottone, N., & Peer, L. (2023). Unintended research code errors and computational reproducibility [Manuscript submitted for publication].

Peer, L. (2022). Ten years of sharing reproducible research. ResearchDataQ. https://researchdataq.org/editorials/ten-years-of-sharing-reproducible-research/

Peer, L., & Dull, J. (2020). YARD: A tool for curating research outputs. Data Science Journal, 19(28), 1–11. https://doi.org/10.5334/dsj-2020-028

Peer, L., & Green, A. (2012). Building an open data repository for a specialized research community: Process, challenges and lessons. International Journal of Digital Curation, 7(1), 51–62.

Peer, L., Green, A., & Stephenson, E. (2014). Committing to data quality review. International Journal of Digital Curation, 9(1). https://doi.org/10.2218/ijdc.v9i1.317

Pownall, M., & Hoerst, C. (2022, January 11). Slow science in scholarly critique [Letter]. The Psychologist. The British Psychological Society. https://www.bps.org.uk/psychologist/slow-science-scholarly-critique

Strand, J. (2023). Error tight: Exercises for lab groups to prevent research mistakes. Psychological Methods. Advance online publication. https://doi.org/10.1037/met0000547

Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3, Article 160018. https://doi.org/10.1038/sdata.2016.18


©2024 Limor Peer. This article is licensed under a Creative Commons Attribution (CC BY 4.0) International license, except where otherwise indicated with respect to particular material included in the article.
