
Reproducibility

The degree to which research results agree when the work is repeated. Resources on reproducibility, replicability, repeatability, robustness, generalizability, organization, documentation, automation, dissemination, guidance, definitions, and more.

183 affiliated resources

Reproducible and transparent research practices in published neurology research
Unrestricted Use
CC BY

The objective of this study was to evaluate the nature and extent of reproducible and transparent research practices in neurology publications.

Methods: The NLM catalog was used to identify MEDLINE-indexed neurology journals. A PubMed search of these journals was conducted to retrieve publications over a 5-year period from 2014 to 2018. A random sample of publications was extracted. Two authors conducted data extraction in a blinded, duplicate fashion using a pilot-tested Google form. This form prompted data extractors to determine whether publications provided access to items such as study materials, raw data, analysis scripts, and protocols. In addition, we determined if the publication was included in a replication study or systematic review, was preregistered, had a conflict of interest declaration, specified funding sources, and was open access.

Results: Our search identified 223,932 publications meeting the inclusion criteria, from which 400 were randomly sampled. Only 389 articles were accessible, yielding 271 publications with empirical data for analysis. Our results indicate that 9.4% provided access to materials, 9.2% provided access to raw data, 0.7% provided access to the analysis scripts, 0.7% linked the protocol, and 3.7% were preregistered. A third of sampled publications lacked funding or conflict of interest statements. No publications from our sample were included in replication studies, but a fifth were cited in a systematic review or meta-analysis.

Conclusions: Currently, published neurology research does not consistently provide information needed for reproducibility. The implications of poor research reporting can both affect patient care and increase research waste. Collaborative intervention by authors, peer reviewers, journals, and funding sources is needed to mitigate this problem.
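
The random-sampling step in the methods is easy to make reproducible in code. A minimal sketch in base R (the seed is arbitrary, the counts come from the abstract, and this is an illustration, not the authors' actual script):

```r
# Reproducible random sampling: fixing the seed lets another group
# re-create exactly the same draw of 400 publications.
set.seed(2019)                                  # arbitrary, but must be recorded
population <- seq_len(223932)                   # publications meeting inclusion criteria
sampled_ids <- sample(population, size = 400)   # the random sample described above
head(sampled_ids)
```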

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Social Science
Material Type:
Reading
Provider:
Research Integrity and Peer Review
Author:
Austin L. Johnson
Daniel Tritz
Jonathan Pollard
Matt Vassar
Shelby Rauh
Trevor Torgerson
Date Added:
08/07/2020
Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017
Unrestricted Use
CC BY

Currently, there is a growing interest in ensuring the transparency and reproducibility of the published scientific literature. According to a previous evaluation of 441 biomedical journal articles published in 2000–2014, the biomedical literature largely lacked transparency in important dimensions. Here, we surveyed a random sample of 149 biomedical articles published between 2015 and 2017 and determined the proportion reporting sources of public and/or private funding and conflicts of interest, sharing protocols and raw data, and undergoing rigorous independent replication and reproducibility checks. We also investigated what can be learned about reproducibility and transparency indicators from open access data provided on PubMed. The majority of the 149 studies disclosed some information regarding funding (103, 69.1% [95% confidence interval, 61.0% to 76.3%]) or conflicts of interest (97, 65.1% [56.8% to 72.6%]). Among the 104 articles with empirical data in which protocols or data sharing would be pertinent, 19 (18.3% [11.6% to 27.3%]) discussed publicly available data; only one (1.0% [0.1% to 6.0%]) included a link to a full study protocol. Among the 97 articles in which replication in studies with different data would be pertinent, there were five replication efforts (5.2% [1.9% to 12.2%]). Although clinical trial identification numbers and funding details were often provided on PubMed, only two of the articles without a full-text article in PubMed Central that discussed publicly available data at the full-text level also contained information related to data sharing on PubMed; none had a conflict of interest statement on PubMed. Our evaluation suggests that although there have been improvements over the last few years in certain key indicators of reproducibility and transparency, opportunities exist to improve reproducible research practices across the biomedical literature and to make features related to reproducibility more readily visible in PubMed.
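
Proportions and confidence intervals like those reported above are straightforward to recompute. A minimal sketch in base R using the funding-disclosure counts (103 of 149); note that binom.test gives an exact Clopper-Pearson interval, which may differ slightly from whichever method the authors used:

```r
# 103 of 149 articles disclosed some funding information (from the abstract).
result <- binom.test(x = 103, n = 149)
result$estimate   # ~0.691, i.e., 69.1%
result$conf.int   # exact 95% CI, close to the reported [61.0%, 76.3%]
```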

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
PLOS Biology
Author:
John P. A. Ioannidis
Joshua D. Wallach
Kevin W. Boyack
Date Added:
08/07/2020
Research practices and statistical reporting quality in 250 economic psychology master's theses: a meta-research investigation
Unrestricted Use
CC BY

The replicability of research findings has recently been disputed across multiple scientific disciplines. As a constructive reaction, the research culture in psychology is facing fundamental changes, but investigations of the research practices that prompted these reforms have almost exclusively focused on academic researchers. By contrast, we investigated the statistical reporting quality and selected indicators of questionable research practices (QRPs) in psychology students' master's theses. In a total of 250 theses, we examined the utilization and magnitude of standardized effect sizes, along with statistical power, the consistency and completeness of reported results, and possible indications of p-hacking and further testing. Effect sizes were reported for 36% of focal tests (median r = 0.19), and only a single formal power analysis was reported for sample size determination (median observed power 1 − β = 0.67). Statcheck revealed inconsistent p-values in 18% of cases, while 2% led to decision errors. There were no clear indications of p-hacking or further testing. We discuss our findings in light of promoting open science standards in teaching and student supervision.
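
The formal power analysis that was almost entirely missing from these theses takes one line in R. A sketch assuming the third-party pwr package (not part of the study), using the median effect size reported above:

```r
library(pwr)  # assumed power-analysis package

# Sample size needed to detect a correlation of r = 0.19
# with 80% power at a two-sided alpha of .05.
pwr.r.test(r = 0.19, power = 0.80, sig.level = 0.05)
# n works out to roughly 215 participants
```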

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Royal Society Open Science
Author:
Erich Kirchler
Jerome Olsen
Johanna Mosen
Martin Voracek
Date Added:
08/07/2020
Research project initialization and organization following reproducible research guidelines
Read the Fine Print

Workshop goals
- Why are we teaching this
- Why is this important
  - For future and current you
  - For research as a whole
  - Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with
  - links to the Materials
  - schedule
- Materials will be available after the Workshop
- Structure oriented along the Four Facets of Reproducibility:
  - Documentation
  - Organization
  - Automation
  - Dissemination

How this workshop is run
- This is a Carpentries Workshop, which means:
  - a friendly learning environment
  - a Code of Conduct
  - active learning
  - work with the people next to you
  - ask for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Hilmar Lapp
Date Added:
08/07/2020
Resolving the Tension Between Exploration and Confirmation in Preclinical Biomedical Research
Unrestricted Use
CC BY

Confirmation through competent replication is a founding principle of modern science. However, biomedical researchers are rewarded for innovation rather than confirmation, and confirmatory research is often stigmatized as unoriginal and consequently faces barriers to publication. As a result, the current biomedical literature is dominated by exploration, which, to complicate matters further, is often disguised as confirmation. Only recently have scientists and the public begun to realize that high-profile research results in biomedicine often cannot be replicated. Consequently, confirmation has taken center stage in the quest to safeguard the robustness of research findings. Research that pushes the boundaries of, or challenges, what is currently known must necessarily produce a plethora of false positive results. Thus, since discovery, the driving force of scientific progress, is unavoidably linked to high false positive rates and cannot support confirmatory inference, dedicated confirmatory investigation is needed for pivotal results. In this chapter I will argue that the tension between the two modes of research, exploration and confirmation, can be resolved if we conceptually and practically separate them. I will discuss the idiosyncrasies of exploratory and confirmatory studies, with a focus on the specific features of their design, analysis, and interpretation.
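
The link between exploration and false positives can be made concrete with a short calculation of positive predictive value, the probability that a statistically significant finding is actually true. A sketch in base R; the prior probabilities, alpha, and power below are illustrative assumptions, not figures from the chapter:

```r
# PPV = (power * prior) / (power * prior + alpha * (1 - prior))
ppv <- function(prior, alpha = 0.05, power = 0.80) {
  (power * prior) / (power * prior + alpha * (1 - prior))
}

# Exploratory research tests many long-shot hypotheses, so most
# positive results are false even with good power:
ppv(prior = 0.05)  # ~0.46
# Confirmation of an already-plausible hypothesis fares far better:
ppv(prior = 0.50)  # ~0.94
```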

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
Good Research Practice in Non-Clinical Pharmacology and Biomedicine
Author:
Ulrich Dirnagl
Date Added:
08/07/2020
R for Reproducible Scientific Analysis
Unrestricted Use
CC BY

This lesson is part of a Software Carpentry workshop: an introduction to R for non-programmers using the gapminder data. The goal of this lesson is to teach novice programmers to write modular code and best practices for using R for data analysis. R is commonly used in many scientific disciplines for statistical analysis and for its array of third-party packages. We find that many scientists who come to Software Carpentry workshops use R and want to learn more. The emphasis of these materials is to give attendees a strong foundation in the fundamentals of R, and to teach best practices for scientific computing: breaking down analyses into modular units, task automation, and encapsulation. Note that this workshop focuses on teaching the fundamentals of the programming language R and will not teach statistical analysis. The lesson contains more material than can be taught in a day; the instructor notes page has suggested lesson plans suitable for a one-day or half-day workshop. A variety of third-party packages are used throughout this workshop. They are not necessarily the best, nor are they comprehensive, but they are packages we find useful, and they have been chosen primarily for their usability.
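
As a flavor of the modularity the lesson emphasizes, here is a minimal sketch using the gapminder data the lesson is built around (the function name and the choice of 2007 are illustrative, not taken from the lesson):

```r
library(gapminder)  # the example dataset used throughout the lesson

# Encapsulate one analysis step in a reusable function:
# mean life expectancy by continent for a given year.
mean_lifeexp_by_continent <- function(data, which_year) {
  one_year <- subset(data, year == which_year)
  aggregate(lifeExp ~ continent, data = one_year, FUN = mean)
}

mean_lifeexp_by_continent(gapminder, 2007)
```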

Subject:
Applied Science
Computer Science
Information Science
Mathematics
Measurement and Data
Material Type:
Module
Provider:
The Carpentries
Author:
Adam H. Sparks
Ahsan Ali Khoja
Amy Lee
Ana Costa Conrado
Andrew Boughton
Andrew Lonsdale
Andrew MacDonald
Andris Jankevics
Andy Teucher
Antonio Berlanga-Taylor
Ashwin Srinath
Ben Bolker
Bill Mills
Bret Beheim
Clare Sloggett
Daniel
Dave Bridges
David J. Harris
David Mawdsley
Dean Attali
Diego Rabatone Oliveira
Drew Tyre
Elise Morrison
Erin Alison Becker
Fernando Mayer
François Michonneau
Giulio Valentino Dalla Riva
Gordon McDonald
Greg Wilson
Harriet Dashnow
Ido Bar
Jaime Ashander
James Balamuta
James Mickley
Jamie McDevitt-Irwin
Jeffrey Arnold
Jeffrey Oliver
John Blischak
Jonah Duckles
Josh Quan
Julia Piaskowski
Kara Woo
Kate Hertweck
Katherine Koziar
Katrin Leinweber
Kellie Ottoboni
Kevin Weitemier
Kiana Ashley West
Kieran Samuk
Kunal Marwaha
Kyriakos Chatzidimitriou
Lachlan Deer
Lex Nederbragt
Liz Ing-Simmons
Lucy Chang
Luke W Johnston
Luke Zappia
Marc Sze
Marie-Helene Burle
Marieke Frassl
Mark Dunning
Martin John Hadley
Mary Donovan
Matt Clark
Melissa Kardish
Mike Jackson
Murray Cadzow
Narayanan Raghupathy
Naupaka Zimmerman
Nelly Sélem
Nicholas Lesniak
Nicholas Potter
Nima Hejazi
Nora Mitchell
Olivia Rata Burge
Paula Andrea Martinez
Pete Bachant
Phil Bouchet
Philipp Boersch-Supan
Piotr Banaszkiewicz
Raniere Silva
Rayna Michelle Harris
Remi Daigle
Research Bazaar
Richard Barnes
Robert Bagchi
Rémi Emonet
Sam Penrose
Sandra Brosda
Sarah Munro
Sasha Lavrentovich
Scott Allen Funkhouser
Scott Ritchie
Sebastien Renaut
Thea Van Rossum
Timothy Eoin Moore
Timothy Rice
Tobin Magle
Trevor Bekolay
Tyler Crawford Kelly
Vicken Hillis
Yuka Takemon
bippuspm
butterflyskip
waiteb5
Date Added:
03/20/2017
R for Social Scientists
Unrestricted Use
CC BY

Data Carpentry lesson, part of the Social Sciences curriculum, that teaches how to analyse and visualise data used by social scientists. Data Carpentry's aim is to teach researchers basic concepts, skills, and tools for working with data so that they can get more done in less time, and with less pain. The lessons below were designed for those interested in working with social sciences data in R. This is an introduction to R designed for participants with no programming experience, and it can be taught in a day (~6 hours). The lessons start with basic information about R syntax and the RStudio interface, then move through how to import CSV files, the structure of data frames, how to deal with factors, how to add and remove rows and columns, how to calculate summary statistics from a data frame, and a brief introduction to plotting.
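
A minimal sketch of the workflow the lesson walks through, from CSV import to summary statistics; the file name and column names here are hypothetical, not the lesson's actual dataset:

```r
# Hypothetical survey data: a CSV with columns 'village' and 'household_size'.
surveys <- read.csv("surveys.csv")

# Treat the grouping variable as a factor, as the lesson discusses.
surveys$village <- factor(surveys$village)

# Summary statistics from a data frame: mean household size per village.
aggregate(household_size ~ village, data = surveys, FUN = mean)
```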

Subject:
Applied Science
Information Science
Mathematics
Measurement and Data
Social Science
Material Type:
Module
Provider:
The Carpentries
Author:
Angela Li
Ben Marwick
Christina Maimone
Danielle Quinn
Erin Alison Becker
Francois Michonneau
Geoffrey LaFlair
Hao Ye
Jake Kaupp
Juan Fung
Katrin Leinweber
Martin Olmos
Murray Cadzow
Date Added:
08/07/2020
Rigor Champions and Resources
Unrestricted Use
Public Domain

Efforts to Instill the Fundamental Principles of Rigorous Research
Rigorous experimental procedures and transparent reporting of research results are vital to the continued success of the biomedical enterprise at both the preclinical and the clinical levels; therefore, NINDS convened major stakeholders in October 2018 to discuss how best to encourage rigorous biomedical research practices. The attendees discussed potential improvements to current training resources meant to instill the principles of rigorous research in current and future scientists, ideal attributes of a potential new educational resource, and cultural factors needed to ensure the success of such training. Please see the event website for more information about this workshop, including video recordings of the discussion, or the recent publication summarizing the workshop.

Rigor Champions
As described in this publication, enthusiastic individuals ("champions") who want to drive improvements in rigorous research practices, transparent reporting, and comprehensive education may come from all career stages and sectors, including undergraduate students, graduate students, postdoctoral fellows, researchers, educators, institutional leaders, journal editors, scientific societies, private industry, and funders. We encourage champions to organize themselves into intra- and inter-institutional communities to effect change within and across scientific institutions. These communities can then share resources and best practices, propose changes to current training and research infrastructure, build new tools to support better research practices, and support rigorous research on a daily basis. If you are interested in learning more, you can join this grassroots online workspace or email us at RigorChampions@nih.gov.

Rigor Resources
In order to understand the current landscape of training in the principles of rigorous research, NINDS is gathering a list of public resources that are, or can be made, freely accessible to the scientific community and beyond. We hope that compiling these resources will help identify gaps in training and stimulate discussion about proposed improvements and the building of new resources that facilitate training in transparency and other rigorous research practices. Please peruse the resources compiled thus far below, and contact us at RigorChampions@nih.gov to let us know about other potential resources. NINDS does not endorse any of these resources and leaves it to the scientific community to judge their quality.

Resources Table
Categories of resources listed in the table include Books and Articles, Guidelines and Protocols, Organizations and Training Programs, Software and Other Digital Resources, and Videos and Courses.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
National Institutes of Health
Author:
National Institutes of Health
Date Added:
08/07/2020
Rigor and Reproducibility | grants.nih.gov
Read the Fine Print

The information provided on this website is designed to assist the extramural community in addressing rigor and transparency in NIH grant applications and progress reports. Scientific rigor and transparency in conducting biomedical research is key to the successful application of knowledge toward improving health outcomes.

Definition
Scientific rigor is the strict application of the scientific method to ensure unbiased and well-controlled experimental design, methodology, analysis, interpretation and reporting of results.

Goals
The NIH strives to exemplify and promote the highest level of scientific integrity, public accountability, and social responsibility in the conduct of science. Grant application instructions and the criteria by which reviewers are asked to evaluate the scientific merit of an application are intended to:

• ensure that NIH is funding the best and most rigorous science,
• highlight the need for applicants to describe details that may have been previously overlooked,
• highlight the need for reviewers to consider such details in their reviews through updated review language, and
• minimize additional burden.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Author:
NIH
Date Added:
08/07/2020
R para Análisis Científicos Reproducibles
Unrestricted Use
CC BY

An introduction to R using the Gapminder data. The goal of this lesson is to teach novice programmers to write modular code and to adopt good practices when using R for data analysis. R provides a set of third-party packages that are commonly used across scientific disciplines for statistical analysis. We find that many scientists who attend Software Carpentry workshops use R and want to learn more. Our materials are relevant because they give attendees a solid foundation in the fundamentals of R and teach best practices for scientific computing: breaking analyses down into modules, task automation, and encapsulation. Note that this workshop focuses on the fundamentals of the R programming language and not on statistical analysis. A variety of third-party packages are used throughout this workshop; they are not necessarily the best, nor are all of their features explained, but they are packages we consider useful, chosen primarily for their ease of use.
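
To illustrate the task automation and encapsulation this lesson emphasizes, a minimal sketch on the same gapminder data; the helper function is illustrative, not from the lesson:

```r
library(gapminder)  # the lesson's example dataset

# Encapsulate a per-continent summary, then automate it across continents.
summarize_continent <- function(cont) {
  rows <- subset(gapminder, continent == cont & year == 2007)
  c(mean_gdp = mean(rows$gdpPercap), mean_lifeExp = mean(rows$lifeExp))
}

sapply(levels(gapminder$continent), summarize_continent)
```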

Subject:
Applied Science
Computer Science
Information Science
Mathematics
Measurement and Data
Material Type:
Module
Provider:
The Carpentries
Author:
A. s
Alejandra Gonzalez-Beltran
Ana Beatriz Villaseñor Altamirano
Antonio
AntonioJBT
Belinda Weaver
Claudia Engel
Cynthia Monastirsky
Daniel Beiter
David Mawdsley
David Pérez-Suárez
Erin Becker
EuniceML
François Michonneau
Gordon McDonald
Guillermina Actis
Guillermo Movia
Hely Salgado
Ido Bar
Ivan Ogasawara
Ivonne Lujano
James J Balamuta
Jamie McDevitt-Irwin
Jeff Oliver
Jonah Duckles
Juan M. Barrios
Katrin Leinweber
Kevin Alquicira
Kevin Martínez-Folgar
Laura Angelone
Laura-Gomez
Leticia Vega
Marcela Alfaro Córdoba
Marceline Abadeer
Maria Florencia D'Andrea
Marie-Helene Burle
Marieke Frassl
Matias Andina
Murray Cadzow
Narayanan Raghupathy
Naupaka Zimmerman
Paola Prieto
Paula Andrea Martinez
Raniere Silva
Rayna M Harris
Richard Barnes
Richard McCosh
Romualdo Zayas-Lagunas
Sandra Brosda
Sasha Lavrentovich
Shirley Alquicira Hernandez
Silvana Pereyra
Tobin Magle
Veronica Jimenez
juli arancio
raynamharris
saynomoregrl
Date Added:
08/07/2020
Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability
Unrestricted Use
CC BY

An academic scientist’s professional success depends on publishing. Publishing norms emphasize novel, positive results. As such, disciplinary incentives encourage design, analysis, and reporting decisions that elicit positive results and ignore negative results. Prior reports demonstrate how these incentives inflate the rate of false effects in published science. When incentives favor novelty over replication, false results persist in the literature unchallenged, reducing efficiency in knowledge accumulation. Previous suggestions to address this problem are unlikely to be effective. For example, a journal of negative results publishes otherwise unpublishable reports. This enshrines the low status of the journal and its content. The persistence of false findings can be meliorated with strategies that make the fundamental but abstract accuracy motive—getting it right—competitive with the more tangible and concrete incentive—getting it published. This article develops strategies for improving scientific practices and knowledge accumulation that account for ordinary human motivations and biases.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Perspectives on Psychological Science
Author:
Brian A. Nosek
Jeffrey R. Spies
Matt Motyl
Date Added:
08/07/2020
A Social Psychological Model of Scientific Practices: Explaining Research Practices and Outlining the Potential for Successful Reforms
Unrestricted Use
CC BY

A crescendo of incidents has raised concerns about whether scientific practices in psychology may be suboptimal, sometimes leading to the publication, dissemination, and application of unreliable or misinterpreted findings. Psychology has been a leader in identifying possibly suboptimal practices and proposing reforms that might enhance the efficiency of the scientific process and the publication of robust evidence and interpretations. To help shape future efforts, this paper offers a model of the psychological and socio-structural forces and processes that may influence scientists' practices. The model identifies practices targeted by interventions and reforms, and which practices remain unaddressed. The model also suggests directions for empirical research to assess how best to enhance the effectiveness of psychological inquiry.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Psychologica Belgica
Author:
Jon A. Krosnick
Lee Jussim
Sean T. Stevens
Stephanie M. Anglin
Date Added:
08/07/2020
Social Science Workshop Overview
Unrestricted Use
CC BY

Workshop overview for the Data Carpentry Social Sciences curriculum. Data Carpentry’s aim is to teach researchers basic concepts, skills, and tools for working with data so that they can get more done in less time, and with less pain. This workshop teaches data management and analysis for social science research including best practices for data organization in spreadsheets, reproducible data cleaning with OpenRefine, and data analysis and visualization in R. This curriculum is designed to be taught over two full days of instruction. Materials for teaching data analysis and visualization in Python and extraction of information from relational databases using SQL are in development. Interested in teaching these materials? We have an onboarding video and accompanying slides available to prepare Instructors to teach these lessons. After watching this video, please contact team@carpentries.org so that we can record your status as an onboarded Instructor. Instructors who have completed onboarding will be given priority status for teaching at centrally-organized Data Carpentry Social Sciences workshops.

Subject:
Applied Science
Information Science
Mathematics
Measurement and Data
Social Science
Material Type:
Module
Provider:
The Carpentries
Author:
Angela Li
Erin Alison Becker
Francois Michonneau
Maneesha Sane
Sarah Brown
Tracy Teal
Date Added:
08/07/2020
Software Carpentry
Unrestricted Use
CC BY

Since 1998, Software Carpentry has been teaching researchers the computing skills they need to get more done in less time and with less pain. Our volunteer instructors have run hundreds of events for more than 34,000 researchers since 2012. All of our lesson materials are freely reusable under the Creative Commons - Attribution license.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Full Course
Provider:
Software Carpentry Community
Author:
Software Carpentry Community
Date Added:
06/18/2020
Statcheck
Read the Fine Print
Some Rights Reserved

statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.
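
A quick sketch of how the R version is typically used (the example result string is made up):

```r
library(statcheck)  # the R package behind statcheck/web

# statcheck() extracts APA-formatted results from text and recomputes
# each p-value from the reported test statistic and degrees of freedom.
txt <- "The effect was significant, t(28) = 2.20, p = .03."
statcheck(txt)
# The output flags results whose reported and recomputed p-values disagree.
```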

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
Statcheck
Author:
Michele B. Nuijten
Sacha Epskamp
Date Added:
08/07/2020
Statistics with JASP and the Open Science Framework
Unrestricted Use
CC BY

This webinar will introduce the integration of JASP Statistical Software (https://jasp-stats.org/) with the Open Science Framework (OSF; https://osf.io). The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, Github, Mendeley, and now is integrated with JASP, to streamline workflows and increase efficiency.
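
Beyond JASP's built-in integration, the OSF can also be scripted from R. A sketch assuming the rOpenSci osfr package and a hypothetical five-character project GUID:

```r
library(osfr)  # OSF API client (assumed installed)

project <- osf_retrieve_node("abc12")  # "abc12" is a hypothetical project GUID

# List the files stored in the project and download them locally,
# e.g., to pull a shared JASP file into a local reproducible workflow.
files <- osf_ls_files(project)
osf_download(files)
```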

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Ten Simple Rules for Reproducible Computational Research
Unrestricted Use
CC BY

Replication is the cornerstone of a cumulative science. However, new tools and technologies, massive amounts of data, interdisciplinary approaches, and the complexity of the questions being asked are complicating replication efforts, as are increased pressures on scientists to advance their research. As full replication of studies on independently collected data is often not feasible, there has recently been a call for reproducible research as an attainable minimum standard for assessing the value of scientific claims. This requires that papers in experimental science describe the results and provide a sufficiently clear protocol to allow successful repetition and extension of analyses based on original data.

The importance of replication and reproducibility has recently been exemplified through studies showing that scientific papers commonly leave out experimental details essential for reproduction, studies showing difficulties with replicating published experimental results, an increase in retracted papers, and through a high number of failing clinical trials. This has led to discussions on how individual researchers, institutions, funding bodies, and journals can establish routines that increase transparency and reproducibility. In order to foster such aspects, it has been suggested that the scientific community needs to develop a “culture of reproducibility” for computational science, and to require it for published claims.

We want to emphasize that reproducibility is not only a moral responsibility with respect to the scientific field, but that a lack of reproducibility can also be a burden for you as an individual researcher. As an example, a good practice of reproducibility is necessary in order to allow previously developed methodology to be effectively applied on new data, or to allow reuse of code and results for new projects. In other words, good habits of reproducibility may actually turn out to be a time-saver in the longer run. We further note that reproducibility is just as much about the habits that ensure reproducible research as the technologies that can make these processes efficient and realistic.

Each of the following ten rules captures a specific aspect of reproducibility, and discusses what is needed in terms of information handling and tracking of procedures. If you are taking a bare-bones approach to bioinformatics analysis, i.e., running various custom scripts from the command line, you will probably need to handle each rule explicitly. If you are instead performing your analyses through an integrated framework (such as GenePattern, Galaxy, LONI pipeline, or Taverna), the system may already provide full or partial support for most of the rules. What is needed on your part is then merely the knowledge of how to exploit these existing possibilities.
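
One habit the rules keep returning to, recording exactly how results were produced, is cheap to adopt. A minimal sketch in R (the file name is illustrative):

```r
# Fix the random seed so any stochastic steps can be repeated exactly.
set.seed(42)

# ... analysis steps go here ...

# Record the exact R version and loaded package versions for this run,
# so the computational environment can be reconstructed later.
writeLines(capture.output(sessionInfo()), "sessionInfo.txt")
```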

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Reading
Provider:
PLOS Computational Biology
Author:
Anton Nekrutenko
Eivind Hovig
Geir Kjetil Sandve
James Taylor
Date Added:
08/07/2020
Tools for Reproducible Research
Read the Fine Print

Course summary
A minimal standard for data analysis and other scientific computations is that they be reproducible: that the code and data are assembled in a way so that another group can re-create all of the results (e.g., the figures in a paper). The importance of such reproducibility is now widely recognized, but it is still not so widely practiced as it should be, in large part because many computational scientists (and particularly statisticians) have not fully adopted the required tools for reproducible research.

In this course, we will discuss general principles for reproducible research but will focus primarily on the use of relevant tools (particularly make, git, and knitr), with the goal that the students leave the course ready and willing to ensure that all aspects of their computational research (software, data analyses, papers, presentations, posters) are reproducible.
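
Of the tools named above, knitr is the one driven directly from R; a one-line sketch with a hypothetical file name:

```r
# knitr re-executes the code chunks embedded in a literate document and
# weaves the results into the output, so every figure and number is
# regenerated from the underlying data on each run.
knitr::knit("analysis.Rmd")  # hypothetical source document

# In this course's setup, a call like this would typically sit behind a
# Makefile target, with git tracking analysis.Rmd and the code it calls.
```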

Subject:
Applied Science
Information Science
Material Type:
Full Course
Author:
Karl Broman
Date Added:
08/07/2020
Toward Reproducible Computational Research: An Empirical Analysis of Data and Code Policy Adoption by Journals
Unrestricted Use
CC BY

Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
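
The predictive model described above is, at heart, a logistic regression. A sketch of that model form on fabricated stand-in data (the data frame and variable names are made up; the study's actual dataset is the one the authors introduce):

```r
set.seed(1)  # make the fabricated illustration data reproducible

# Stand-in journal-level data: policy status, impact factor, publisher type.
journals <- data.frame(
  open_data = rbinom(170, 1, 0.38),                # 1 = journal has an open data policy
  impact    = rlnorm(170, meanlog = 1, sdlog = 1), # journal impact factor
  society   = rbinom(170, 1, 0.5)                  # 1 = scientific-society publisher
)

# Model policy adoption as a function of impact factor and publisher type.
fit <- glm(open_data ~ impact + society, data = journals, family = binomial)
summary(fit)
```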

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Peixuan Guo
Victoria Stodden
Zhaokun Ma
Date Added:
08/07/2020