Policy

Open scholarship policy examples, guidance, and research. Institutional policy, funder policy, publisher policy, governmental policy, tenure policies, and more.

24 affiliated resources

7 Easy Steps to Open Science: An Annotated Reading List
Unrestricted Use
CC BY

The Open Science movement is rapidly changing the scientific landscape. Because exact definitions are often lacking and reforms are constantly evolving, accessible guides to open science are needed. This paper provides an introduction to open science and related reforms in the form of an annotated reading list of seven peer-reviewed articles, following the format of Etz et al. (2018). Written for researchers and students - particularly in psychological science - it highlights and introduces seven topics: understanding open science; open access; open data, materials, and code; reproducible analyses; preregistration and registered reports; replication research; and teaching open science. For each topic, we provide a detailed summary of one particularly informative and actionable article and suggest several further resources. Supporting a broader understanding of open science issues, this overview should enable researchers to engage with, improve, and implement current open, transparent, reproducible, replicable, and cumulative scientific practices.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Author:
Alexander Etz
Amy Orben
Hannah Moshontz
Jesse Niebaum
Johnny van Doorn
Matthew Makel
Michael Schulte-Mecklenbeck
Sam Parsons
Sophia Crüwell
Date Added:
08/12/2019
Awesome Open Science Resources
Unrestricted Use
CC BY

Scientific data and tools should, as much as possible, be free as in beer and free as in freedom. The vast majority of science today is paid for by taxpayer-funded grants; at the same time, the incredible successes of science are strong evidence for the benefit of collaboration in the pursuit of knowledge. Within the scientific academy, sharing of expertise, data, tools, and other resources is prolific, but only recently, with the rise of the Open Access movement, has this sharing come to embrace the public. Even though most research data is never shared, both the public and even scientists within their own fields are often unaware of just how much data, tools, and other resources are made freely available for analysis! This list is a small attempt at bringing light to data repositories and computational science tools that are often siloed by scientific discipline, in the hopes of spurring along both public and professional contributions to science.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Author:
Austin Soplata
Date Added:
09/23/2018
Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition
Unrestricted Use
CC BY

Access to data is a critical feature of an efficient, progressive and ultimately self-correcting scientific ecosystem. But the extent to which in-principle benefits of data sharing are realized in practice is unclear. Crucially, it is largely unknown whether published findings can be reproduced by repeating reported analyses upon shared data (‘analytic reproducibility’). To investigate this, we conducted an observational evaluation of a mandatory open data policy introduced at the journal Cognition. Interrupted time-series analyses indicated a substantial post-policy increase in data available statements (104/417, 25% pre-policy to 136/174, 78% post-policy), although not all data appeared reusable (23/104, 22% pre-policy to 85/136, 62%, post-policy). For 35 of the articles determined to have reusable data, we attempted to reproduce 1324 target values. Ultimately, 64 values could not be reproduced within a 10% margin of error. For 22 articles all target values were reproduced, but 11 of these required author assistance. For 13 articles at least one value could not be reproduced despite author assistance. Importantly, there were no clear indications that original conclusions were seriously impacted. Mandatory open data policies can increase the frequency and quality of data sharing. However, suboptimal data curation, unclear analysis specification and reporting errors can impede analytic reproducibility, undermining the utility of data sharing and the credibility of scientific findings.
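
The reproducibility criterion mentioned in this abstract (a value counts as reproduced if it falls within a 10% margin of error) can be illustrated with a short check. The sketch below is only one plausible reading of that criterion, using relative error against the originally reported value; the helper function, its name, and the handling of zero-valued targets are illustrative assumptions, not the authors' analysis code.

def within_margin(reported, reproduced, margin=0.10):
    # Treat a value as reproduced if its relative error against the
    # originally reported value is at most `margin` (10% by default).
    # Hypothetical helper; the paper's exact criterion may differ,
    # e.g. in how zero-valued targets are handled.
    if reported == 0:
        return abs(reproduced) <= margin
    return abs(reproduced - reported) / abs(reported) <= margin

# A reported mean of 2.50 reproduced as 2.61 differs by about 4.4%, so it counts as reproduced here.
print(within_margin(2.50, 2.61))  # True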

Subject:
Applied Science
Information Science
Material Type:
Reading
Provider:
Royal Society Open Science
Author:
Alicia Hofelich Mohr
Bria Long
Elizabeth Clayton
Erica J. Yoon
George C. Banks
Gustav Nilsonne
Kyle MacDonald
Mallory C. Kidwell
Maya B. Mathur
Michael C. Frank
Michael Henry Tessler
Richie L. Lenne
Sara Altman
Tom E. Hardwicke
Date Added:
08/07/2020
Data policies of highly-ranked social science journals
Unrestricted Use
CC BY

By encouraging and requiring that authors share their data in order to publish articles, scholarly journals have become an important actor in the movement to improve the openness of data and the reproducibility of research. But how many social science journals encourage or mandate that authors share the data supporting their research findings? How does the share of journal data policies vary by discipline? What influences these journals’ decisions to adopt such policies and instructions? And what do those policies and instructions look like? We discuss the results of our analysis of the instructions and policies of 291 highly-ranked journals publishing social science research, where we studied the contents of journal data policies and instructions across 14 variables, such as when and how authors are asked to share their data, and what role journal ranking and age play in the existence and quality of data policies and instructions. We also compare our results to the results of other studies that have analyzed the policies of social science journals, although differences in the journals chosen and how each study defines what constitutes a data policy limit this comparison. We conclude that a little more than half of the journals in our study have data policies. A greater share of the economics journals have data policies and mandate sharing, followed by political science/international relations and psychology journals. Finally, we use our findings to make several recommendations: Policies should include the terms “data,” “dataset,” or more specific terms that make it clear what to make available; policies should include the benefits of data sharing; journals, publishers, and associations need to collaborate more to clarify data policies; and policies should explicitly ask for qualitative data.

Subject:
Psychology
Social Science
Material Type:
Reading
Author:
Abigail Schwartz
Dessi Kirilova
Gerard Otalora
Julian Gautier
Mercè Crosas
Sebastian Karcher
Date Added:
08/07/2020
Data sharing in PLOS ONE: An analysis of Data Availability Statements
Unrestricted Use
CC BY

A number of publishers and funders, including PLOS, have recently adopted policies requiring researchers to share the data underlying their results and publications. Such policies help increase the reproducibility of the published literature, as well as make a larger body of data available for reuse and re-analysis. In this study, we evaluate the extent to which authors have complied with this policy by analyzing Data Availability Statements from 47,593 papers published in PLOS ONE between March 2014 (when the policy went into effect) and May 2016. Our analysis shows that compliance with the policy has increased, with a significant decline over time in papers that did not include a Data Availability Statement. However, only about 20% of statements indicate that data are deposited in a repository, which the PLOS policy states is the preferred method. More commonly, authors state that their data are in the paper itself or in the supplemental information, though it is unclear whether these data meet the level of sharing required in the PLOS policy. These findings suggest that additional review of Data Availability Statements or more stringent policies may be needed to increase data sharing.

Subject:
Applied Science
Computer Science
Health, Medicine and Nursing
Information Science
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Alicia Livinski
Christopher W. Belter
Douglas J. Joubert
Holly Thompson
Lisa M. Federer
Lissa N. Snyders
Ya-Ling Lu
Date Added:
08/07/2020
Foster Open Science
Unrestricted Use
CC BY

The FOSTER portal is an e-learning platform that brings together the best training resources for those who need to know more about Open Science, or need to develop strategies and skills for implementing Open Science practices in their daily workflows. Here you will find a growing collection of training materials. Many different users - from early-career researchers to data managers, librarians, research administrators, and graduate schools - can benefit from the portal. In order to meet their needs, the existing materials will be extended from basic to more advanced-level resources. In addition, discipline-specific resources will be created.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Full Course
Provider:
FOSTER Open Science
Author:
FOSTER Open Science
Date Added:
08/07/2020
Funder Data-Sharing Policies: Overview and Recommendations
Unrestricted Use
CC BY

This report covers funder data-sharing policies and practices, and provides recommendations to funders and others as they consider their own policies. It was commissioned by the Robert Wood Johnson Foundation in 2017. If you have any comments or questions, please contact Stephanie Wykstra (stephanie.wykstra@gmail.com).

Subject:
Applied Science
Health, Medicine and Nursing
Life Science
Social Science
Material Type:
Reading
Author:
Stephanie Wykstra
Date Added:
08/07/2020
OpenAccess.net
Unrestricted Use
CC BY

The open-access.net platform provides comprehensive information on the subject of Open Access (OA) and offers practical advice on its implementation. Developed collaboratively by the Freie Universität Berlin and the Universities of Goettingen, Konstanz, and Bielefeld, open-access.net first went online at the beginning of May 2007. The platform's target groups include all relevant stakeholders in the science sector, especially the scientists and scholars themselves, university and research institution managers, infrastructure service providers such as libraries and data centres, and funding agencies and policy makers. open-access.net provides easy, one-stop access to comprehensive information on OA.

Aspects covered include OA concepts, legal, organisational and technical frameworks, concrete implementation experiences, initiatives, services, service providers, and position papers. The target-group-oriented and discipline-specific presentation of the content enables users to access relevant themes quickly and efficiently. Moreover, the platform offers practical implementation advice and answers to fundamental questions regarding OA.
In collaboration with cooperation partners in Austria (the University of Vienna) and Switzerland (the University of Zurich), country-specific web pages for these two countries have been integrated into the platform - especially in the Legal Issues section.

Each year since 2007, the information platform has organised the "Open Access Days" at alternating venues in collaboration with local partners. This event is the key conference on OA and Open Science in the German-speaking area.

With funding from the Ministry of Science, Research and the Arts (MWK) of the State of Baden-Württemberg, the platform underwent a complete technical and substantive overhaul in 2015.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
OpenAccess Germany
Author:
OpenAccess Germany
Date Added:
06/18/2020
Open Science Manual
Conditional Remix & Share Permitted
CC BY-NC

About This Document: This manual was assembled and is being updated by Professor Benjamin Le (@benjaminle), who is on the faculty in the Department of Psychology at Haverford College. The primary goal of this text is to provide guidance to his senior thesis students on how to conduct research in his lab by working within general principles that promote research transparency using the specific open science practices described here. While it is aimed at undergraduate psychology students, hopefully it will be of use to other faculty/researchers/students who are interested in adopting open science practices in their labs.

Subject:
Psychology
Social Science
Material Type:
Reading
Author:
Benjamin Le
Date Added:
05/01/2018
An Open Science Primer for Social Scientists
Unrestricted Use
CC BY

“Open Science” has become a buzzword in academic circles. However, exactly what it means, why you should care about it, and – most importantly – how it can be put into practice is often not very clear to researchers. In this session of the SSDL, we will provide a brief tour d'horizon of Open Science in which we touch on all of these issues and by which we hope to equip you with a basic understanding of Open Science and a practical toolkit to help you make your research more open to other researchers and the larger interested public. Throughout the presentation, we will focus on giving you an overview of tools and services that can help you open up your research workflow and your publications, all the way from enhancing the reproducibility of your research and making it more collaborative to finding outlets which make the results of your work accessible to everyone. Absolutely no prior experience with open science is required to participate in this talk, which should lead into an open conversation among us as a community about the best practices we can and should follow for a more open social science.

Subject:
Social Science
Material Type:
Lesson
Author:
Eike Mark Rinke
Date Added:
06/21/2017
Open Science Toolbox
Unrestricted Use
CC BY

There is a vast body of helpful tools that can be used to foster Open Science practices. For reasons of clarity, this toolbox provides only a selection of links to these resources and tools. Our goal is to give a short overview of ways to enhance your Open Science practices without consuming too much of your time.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
Uni Muenchen
Author:
Lutz Heil
Date Added:
07/10/2019
The Open Science Training Handbook
Read the Fine Print
Some Rights Reserved

Open Science, the movement to make scientific products and processes accessible to and reusable by all, is about culture and knowledge as much as it is about technologies and services. Convincing researchers of the benefits of changing their practices, and equipping them with the skills and knowledge needed to do so, is hence an important task. This book offers guidance and resources for Open Science instructors and trainers, as well as anyone interested in improving levels of transparency and participation in research practices. Supporting and connecting an emerging Open Science community that wishes to pass on its knowledge, the handbook suggests training activities that can be adapted to various settings and target audiences. The book equips trainers with methods, instructions, exemplary training outlines and inspiration for their own Open Science trainings. It provides Open Science advocates across the globe with practical know-how to deliver Open Science principles to researchers and support staff. What works, what doesn’t? How can you make the most of limited resources? Here you will find a wealth of resources to help you build your own training events.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
FOSTER Open Science
Author:
FOSTER Open Science
Date Added:
06/18/2020
Open Science: What, Why, and How
Unrestricted Use
CC BY

Open Science is a collection of actions designed to make scientific processes more transparent and results more accessible. Its goal is to build a more replicable and robust science; it does so using new technologies, altering incentives, and changing attitudes. The current movement towards open science was spurred, in part, by a recent “series of unfortunate events” within psychology and other sciences. These events include the large number of studies that have failed to replicate and the prevalence of common research and publication procedures that could explain why. Many journals and funding agencies now encourage, require, or reward some open science practices, including pre-registration, providing full materials, posting data, distinguishing between exploratory and confirmatory analyses, and running replication studies. Individuals can practice and encourage open science in their many roles as researchers, authors, reviewers, editors, teachers, and members of hiring, tenure, promotion, and awards committees. A plethora of resources are available to help scientists, and science, achieve these goals.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Author:
Bobbie Spellman
Elizabeth Gilbert
Katherine Corker
Date Added:
07/02/2018
Pre-analysis Plans: A Stocktaking
Read the Fine Print

The evidence-based community has championed the public registration of pre-analysis plans (PAPs) as a solution to the problem of research credibility, but without any evidence that PAPs actually bolster the credibility of research. We analyze a representative sample of 195 PAPs from the American Economic Association (AEA) and Evidence in Governance and Politics (EGAP) registration platforms to assess whether PAPs are sufficiently clear, precise and comprehensive to be able to achieve their objectives of preventing “fishing” and reducing the scope for post-hoc adjustment of research hypotheses. We also analyze a subset of 93 PAPs from projects that have resulted in publicly available papers to ascertain how faithfully they adhere to their pre-registered specifications and hypotheses. We find significant variation in the extent to which PAPs are accomplishing the goals they were designed to achieve.

Subject:
Economics
Social Science
Material Type:
Reading
Author:
Daniel Posner
George Ofosu
Date Added:
08/07/2020
Public Availability of Published Research Data in High-Impact Journals
Unrestricted Use
CC BY

Background: There is increasing interest to make primary data from published research publicly available. We aimed to assess the current status of making research data available in highly-cited journals across the scientific literature. Methods and Results: We reviewed the first 10 original research papers of 2009 published in the 50 original research journals with the highest impact factor. For each journal we documented the policies related to public availability and sharing of data. Of the 50 journals, 44 (88%) had a statement in their instructions to authors related to public availability and sharing of data. However, there was wide variation in journal requirements, ranging from requiring the sharing of all primary data related to the research to just including a statement in the published manuscript that data can be available on request. Of the 500 assessed papers, 149 (30%) were not subject to any data availability policy. Of the remaining 351 papers that were covered by some data availability policy, 208 papers (59%) did not fully adhere to the data availability instructions of the journals they were published in, most commonly (73%) by not publicly depositing microarray data. The other 143 papers that adhered to the data availability instructions did so by publicly depositing only the specific data type as required, making a statement of willingness to share, or actually sharing all the primary data. Overall, only 47 papers (9%) deposited full primary raw data online. None of the 149 papers not subject to data availability policies made their full primary data publicly available. Conclusion: A substantial proportion of original research papers published in high-impact journals are either not subject to any data availability policies, or do not adhere to the data availability instructions in their respective journals. This empiric evaluation highlights opportunities for improvement.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Alawi A. Alsheikh-Ali
John P. A. Ioannidis
Mouaz H. Al-Mallah
Waqas Qureshi
Date Added:
08/07/2020
Rigor and Reproducibility | grants.nih.gov
Read the Fine Print

The information provided on this website is designed to assist the extramural community in addressing rigor and transparency in NIH grant applications and progress reports. Scientific rigor and transparency in conducting biomedical research are key to the successful application of knowledge toward improving health outcomes.

Definition
Scientific rigor is the strict application of the scientific method to ensure unbiased and well-controlled experimental design, methodology, analysis, interpretation and reporting of results.

Goals
The NIH strives to exemplify and promote the highest level of scientific integrity, public accountability, and social responsibility in the conduct of science. Grant application instructions and the criteria by which reviewers are asked to evaluate the scientific merit of the application are intended to:

• ensure that NIH is funding the best and most rigorous science,
• highlight the need for applicants to describe details that may have been previously overlooked,
• highlight the need for reviewers to consider such details in their reviews through updated review language, and
• minimize additional burden.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Author:
NIH
Date Added:
08/07/2020
Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability
Unrestricted Use
CC BY

An academic scientist’s professional success depends on publishing. Publishing norms emphasize novel, positive results. As such, disciplinary incentives encourage design, analysis, and reporting decisions that elicit positive results and ignore negative results. Prior reports demonstrate how these incentives inflate the rate of false effects in published science. When incentives favor novelty over replication, false results persist in the literature unchallenged, reducing efficiency in knowledge accumulation. Previous suggestions to address this problem are unlikely to be effective. For example, a journal of negative results publishes otherwise unpublishable reports. This enshrines the low status of the journal and its content. The persistence of false findings can be meliorated with strategies that make the fundamental but abstract accuracy motive—getting it right—competitive with the more tangible and concrete incentive—getting it published. This article develops strategies for improving scientific practices and knowledge accumulation that account for ordinary human motivations and biases.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Perspectives on Psychological Science
Author:
Brian A. Nosek
Jeffrey R. Spies
Matt Motyl
Date Added:
08/07/2020
TOP Guidelines
Read the Fine Print

The Transparency and Openness Promotion guidelines include eight modular standards, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to implement and select a level of implementation for each. These features provide flexibility for adoption depending on disciplinary variation, but simultaneously establish community standards.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
Open Science Collaboration
Author:
Open Science Collaboration
Date Added:
06/26/2015
Toward Reproducible Computational Research: An Empirical Analysis of Data and Code Policy Adoption by Journals
Unrestricted Use
CC BY

Journal policy on research data and code availability is an important part of the ongoing shift toward publishing reproducible computational science. This article extends the literature by studying journal data sharing policies by year (for both 2011 and 2012) for a referent set of 170 journals. We make a further contribution by evaluating code sharing policies, supplemental materials policies, and open access status for these 170 journals for each of 2011 and 2012. We build a predictive model of open data and code policy adoption as a function of impact factor and publisher and find higher impact journals more likely to have open data and code policies and scientific societies more likely to have open data and code policies than commercial publishers. We also find open data policies tend to lead open code policies, and we find no relationship between open data and code policies and either supplemental material policies or open access journal status. Of the journals in this study, 38% had a data policy, 22% had a code policy, and 66% had a supplemental materials policy as of June 2012. This reflects a striking one year increase of 16% in the number of data policies, a 30% increase in code policies, and a 7% increase in the number of supplemental materials policies. We introduce a new dataset to the community that categorizes data and code sharing, supplemental materials, and open access policies in 2011 and 2012 for these 170 journals.
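
As a rough illustration of the kind of predictive model this abstract describes (policy adoption as a function of impact factor and publisher type), the sketch below fits a logistic regression on a small, entirely hypothetical journal-level dataset; the variable names, coding, and figures are assumptions for illustration and are not drawn from the study itself.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical journal-level data: whether a journal has an open data policy,
# its impact factor, and whether its publisher is a scientific society (1) or commercial (0).
journals = pd.DataFrame({
    "has_data_policy":   [1, 0, 1, 1, 0, 1, 0, 0, 1, 0],
    "impact_factor":     [12.3, 2.1, 8.7, 15.0, 1.4, 3.2, 9.9, 2.8, 6.5, 4.0],
    "society_publisher": [1, 0, 1, 0, 0, 1, 1, 0, 0, 1],
})

# Logistic regression of policy adoption on impact factor and publisher type,
# one plausible form of the model the abstract describes.
result = smf.logit("has_data_policy ~ impact_factor + society_publisher", data=journals).fit(disp=0)
print(result.summary())

In the study's framing, a positive coefficient on impact_factor would correspond to higher-impact journals being more likely to have open data and code policies, and a positive coefficient on society_publisher to society journals being more likely to have them than journals of commercial publishers.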

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Peixuan Guo
Victoria Stodden
Zhaokun Ma
Date Added:
08/07/2020