
OSKB

This endorsement is the seal of approval for inclusion in the OSKB Library collections.

These resources have been vetted by the OSKB Team.

329 affiliated resources

Recommendations for Increasing Replicability in Psychology
Unrestricted Use
CC BY

Replicability of findings is at the heart of any empirical science. The aim of this article is to move the current replicability debate in psychology towards concrete recommendations for improvement. We focus on research practices but also offer guidelines for reviewers, editors, journal management, teachers, granting institutions, and university promotion committees, highlighting some of the emerging and existing practical solutions that can facilitate implementation of these recommendations. The challenges for improving replicability in psychological science are systemic. Improvement can occur only if changes are made at many levels of practice, evaluation, and reward.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
European Journal of Personality
Author:
Brent W. Roberts
Brian A. Nosek
David C. Funder
Filip De Fruyt
Hannelore Weber
Jaap J. A. Denissen
Jan De Houwer
Jelte M. Wicherts
Jens B. Asendorpf
Klaus Fiedler
Manfred Schmitt
Marcel A. G. van Aken
Marco Perugini
Mark Conner
Reinhold Kliegl
Susann Fiedler
Date Added:
08/07/2020
Registered Reports Q&A
Unrestricted Use
CC BY

This webinar addresses questions related to writing, reviewing, editing, or funding a study using the Registered Report format, featuring Chris Chambers and ...

Subject:
Education
Material Type:
Lesson
Provider:
Center for Open Science
Author:
Chris Chambers
David Mellor
Date Added:
03/31/2021
Registered reports: an early example and analysis
Unrestricted Use
CC BY

The recent ‘replication crisis’ in psychology has focused attention on ways of increasing methodological rigor within the behavioral sciences. Part of this work has involved promoting ‘Registered Reports’, wherein journals peer review papers prior to data collection and publication. Although this approach is usually seen as a relatively recent development, we note that a prototype of this publishing model was initiated in the mid-1970s by parapsychologist Martin Johnson in the European Journal of Parapsychology (EJP). A retrospective and observational comparison of Registered and non-Registered Reports published in the EJP during a seventeen-year period provides circumstantial evidence to suggest that the approach helped to reduce questionable research practices. This paper aims both to bring Johnson’s pioneering work to a wider audience, and to investigate the positive role that Registered Reports may play in helping to promote higher methodological and statistical standards.

Subject:
Applied Science
Information Science
Psychology
Social Science
Material Type:
Reading
Provider:
PeerJ
Author:
Caroline Watt
Diana Kornbrot
Richard Wiseman
Date Added:
08/07/2020
Releasing a preprint is associated with more attention and citations for the peer-reviewed article
Unrestricted Use
CC BY

Preprints in biology are becoming more popular, but only a small fraction of the articles published in peer-reviewed journals have previously been released as preprints. To examine whether releasing a preprint on bioRxiv was associated with the attention and citations received by the corresponding peer-reviewed article, we assembled a dataset of 74,239 articles, 5,405 of which had a preprint, published in 39 journals. Using log-linear regression and random-effects meta-analysis, we found that articles with a preprint had, on average, a 49% higher Altmetric Attention Score and 36% more citations than articles without a preprint. These associations were independent of several other article- and author-level variables (such as scientific subfield and number of authors), and were unrelated to journal-level variables such as access model and Impact Factor. This observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
eLife
Author:
Darwin Y Fu
Jacob J Hughey
Date Added:
08/07/2020
Replicability and Reproducibility in Comparative Psychology
Unrestricted Use
CC BY

Psychology faces a replication crisis. The Reproducibility Project: Psychology sought to replicate the effects of 100 psychology studies. Though 97% of the original studies produced statistically significant results, only 36% of the replication studies did so (Open Science Collaboration, 2015). This inability to replicate previously published results, however, is not limited to psychology (Ioannidis, 2005). Replication projects in medicine (Prinz et al., 2011) and behavioral economics (Camerer et al., 2016) resulted in replication rates of 25 and 61%, respectively, and analyses in genetics (Munafò, 2009) and neuroscience (Button et al., 2013) question the validity of studies in those fields. Science, in general, is reckoning with challenges to one of its basic tenets: replication. Comparative psychology also faces the grand challenge of producing replicable research. Though social psychology has borne the brunt of the critique regarding failed replications, comparative psychology suffers from some of the same problems (e.g., small sample sizes). Yet comparative psychology follows the methods of cognitive psychology by often using within-subjects designs, which may buffer it from replicability problems (Open Science Collaboration, 2015). In this Grand Challenge article, I explore the shared and unique challenges of, and potential solutions for, replication and reproducibility in comparative psychology.

Subject:
Economics
Psychology
Social Science
Material Type:
Reading
Provider:
Frontiers in Psychology
Author:
Jeffrey R. Stevens
Date Added:
08/07/2020
Replication and Bias in (Special) Education Research Base
Unrestricted Use
CC BY

This is a clearinghouse for resources related to open science in Special Education. If you find a good resource that has not been included, please email it to marcy@cos.io.

Subject:
Education
Material Type:
Reading
Author:
OSKB Admin
Date Added:
05/03/2021
Reporting in Experimental Philosophy: Current Standards and Recommendations for Future Practice
Unrestricted Use
CC BY

Recent replication crises in psychology and other fields have led to intense reflection about the validity of common research practices. Much of this reflection has focussed on reporting standards, and how they may be related to the questionable research practices that could underlie a high proportion of irreproducible findings in the published record. As a developing field, it is particularly important for Experimental Philosophy to avoid some of the pitfalls that have beset other disciplines. To this end, here we provide a detailed, comprehensive assessment of current reporting practices in Experimental Philosophy. We focus on the quality of statistical reporting and the disclosure of information about study methodology. We assess all the articles using quantitative methods (n = 134) that were published over the years 2013–2016 in 29 leading philosophy journals. We find that null hypothesis significance testing is the prevalent statistical practice in Experimental Philosophy, although relying solely on this approach has been criticised in the psychological literature. To augment this approach, various additional measures have become commonplace in other fields, but we find that Experimental Philosophy has adopted these only partially: 53% of the papers report an effect size, 28% confidence intervals, 1% examined prospective statistical power and 5% report observed statistical power. Importantly, we find no direct relation between an article’s reporting quality and its impact (numbers of citations). We conclude with recommendations for authors, reviewers and editors in Experimental Philosophy, to facilitate making research statistically-transparent and reproducible.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Review of Philosophy and Psychology
Author:
Andrea Polonioli
Brittany Blankinship
David Carmel
Mariana Vega-Mendoza
Date Added:
08/07/2020
ReproducibiliTea
Read the Fine Print

Everything you need to know about this ECR-led journal club initiative that helps early career researchers create local Open Science groups that discuss issues, papers and ideas to do with improving science.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
UK Reproducibility Network
Date Added:
06/18/2020
Reproducibility Immersive Course
Conditional Remix & Share Permitted
CC BY-SA

Various fields in the natural and social sciences face a ‘crisis of confidence’: a pervasiveness of non-reproducible results in the published literature. In biomedicine, for example, Amgen reported that of 53 landmark published pre-clinical results, only 11% could be replicated successfully. The crisis is not confined to biomedicine; areas that have recently received attention for non-reproducibility include economics, political science, psychology, and philosophy, and some scholars anticipate its expansion to other disciplines. This course explores the state of reproducibility. After a brief historical perspective, case studies from different disciplines (biomedicine, psychology, and philosophy) are examined to understand the issues concretely. Problems that lead to non-reproducibility are then discussed, along with possible solutions and paths forward.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
New York University
Author:
Vicky Steeves
Date Added:
06/01/2018
Reproducibility Librarianship in Practice
Unrestricted Use
CC BY

As research across domains of study (librarianship included) has become increasingly reliant on digital tools, the challenges of reproducibility have grown. Alongside this reproducibility challenge are the demands of open scholarship, such as releasing code, data, and articles under an open license. Researchers in the field once captured their environments through observation, drawings, photographs, and videos; now, researchers and the librarians who work alongside them must capture digital environments and what they contain (e.g., code and data) to achieve reproducibility. Librarians are well positioned to help patrons open their scholarship, and are already engaged with research data management, open access publishing, grant compliance, and pre-registration; it's time we as a profession add reproducibility to that repertoire and build it into our services. In this webinar, organised by LIBER's Research Data Management Working Group, speaker Vicky Steeves discusses how she has built services around reproducibility in a dual appointment between the Libraries and the Center for Data Science at New York University.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
New York University
Author:
Birgit Schmidt
Vicky Steeves
Date Added:
12/04/2018
Reproducibility, Preservation, and Access to Research with ReproZip and ReproServer
Conditional Remix & Share Permitted
CC BY-SA

The adoption of reproducibility remains low, despite incentives becoming increasingly common across domains, conferences, and journals. The truth is, reproducibility is technically difficult to achieve due to the complexities of computational environments. To address these technical challenges, we created ReproZip, an open-source tool that packs research along with all the information necessary to reproduce it, including data files, software, OS version, and environment variables. Everything is bundled into an .rpz file, which users can reproduce with ReproUnzip and an unpacker (Docker, Vagrant, or Singularity). The .rpz file is general and contains rich metadata: more unpackers can be added as needed, better guaranteeing long-term preservation. However, installing the unpackers can still be burdensome for secondary users of ReproZip bundles. In this paper, we discuss how ReproZip and our new tool ReproServer can be used together to facilitate access to well-preserved, reproducible work. ReproServer is a cloud application that allows users to upload, or provide a link to, a ReproZip bundle and then interact with and reproduce its contents from the comfort of their browser. Users are then given a stable link to the unpacked work on ReproServer that they can share with reviewers or colleagues.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
New York University
Author:
Fernando Chirigati
Rémi Rampin
Vicky Steeves
Date Added:
05/31/2019
Reproducibility in Cancer Biology: The challenges of replication
Unrestricted Use
CC BY

Interpreting the first results from the Reproducibility Project: Cancer Biology requires a highly nuanced approach. Reproducibility is a cornerstone of science, and the development of new drugs and medical treatments relies on the results of preclinical research being reproducible. In recent years, however, the validity of published findings in a number of areas of scientific research, including cancer research, has been called into question (Begley and Ellis, 2012; Baker, 2016). One response to these concerns has been the launch of a project to repeat selected experiments from a number of high-profile papers in cancer biology (Morrison, 2014; Errington et al., 2014). The aim of the Reproducibility Project: Cancer Biology, which is a collaboration between the Center for Open Science and Science Exchange, is two-fold: to provide evidence about reproducibility in preclinical cancer research, and to identify the factors that influence reproducibility more generally.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
eLife
Author:
eLife Editors
Date Added:
08/07/2020
Reproducible Research
Read the Fine Print

Modern scientific research takes advantage of open-source languages such as Python and R, which can be modified and shared by the wider community. Additional functionality comes from programs and packages such as IPython, Sweave, and Shiny, which can be used not only to execute data analyses but also to present data and results consistently across platforms (e.g., blogs, websites, repositories, and traditional publishing venues).

The goal of the course is to show how to implement analyses and share them, using IPython for Python and Sweave and knitr for R, to create documents that are shareable and analyses that are reproducible.

Course outline is as follows:
1) Use of IPython notebooks to demonstrate and explain code, visualize data, and display analysis results
2) Applications of Python modules such as SymPy, NumPy, pandas, and SciPy
3) Use of Sweave to demonstrate and explain code, visualize data, display analysis results, and create documents and presentations
4) Integration and execution of IPython and R code and analyses using the IPython notebook
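The kind of reproducibility the course targets can be sketched in plain standard-library Python (the seed value, the simulated dataset, and the checksum step are illustrative inventions, not course material): fixing a random seed makes the analysis deterministic, and a checksum of the reported results lets anyone re-running the notebook confirm an exact replication.

```python
import hashlib
import random
import statistics

# Illustrative only: a fixed seed makes the "analysis" deterministic,
# so a reader re-running this notebook cell gets identical results.
SEED = 20140807
rng = random.Random(SEED)

# Simulate loading a dataset (stands in for reading a CSV with pandas).
data = [rng.gauss(mu=10.0, sigma=2.0) for _ in range(1000)]

# The analysis: summary statistics, rounded for stable reporting.
result = {
    "n": len(data),
    "mean": round(statistics.mean(data), 6),
    "stdev": round(statistics.stdev(data), 6),
}

# A checksum of the reported results lets collaborators confirm an
# exact replication without comparing every number by hand.
digest = hashlib.sha256(repr(sorted(result.items())).encode()).hexdigest()
print(result)
print("result checksum:", digest[:12])
```

The same pattern carries over to the course's stack: seed NumPy with `numpy.random.default_rng(seed)`, and keep notebook cells in execution order so a top-to-bottom re-run reproduces every figure and table.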

Subject:
Applied Science
Information Science
Material Type:
Full Course
Author:
Christopher Ahern
Date Added:
08/07/2020
Reproducible Research Methods
Read the Fine Print

This is the website for the Autumn 2014 course “Reproducible Research Methods” taught by Eric C. Anderson at NOAA’s Southwest Fisheries Science Center. The course meets on Tuesdays and Thursdays from 3:30 to 4:30 PM in Room 188 of the Fisheries Ecology Division and runs from October 7 to December 18.

The goal of this course is for scientists, researchers, and students to learn:

- to write programs in the R language to manipulate and analyze data,
- to integrate data analysis with report generation and article preparation using knitr,
- to work fluently within the RStudio integrated development environment for R,
- to use git version control software and GitHub to effectively manage source code, collaborate efficiently with other researchers, and neatly package their research.

Subject:
Applied Science
Information Science
Material Type:
Full Course
Author:
Eric C. Anderson
Date Added:
08/07/2020
Reproducible Research: Walking the Walk
Read the Fine Print

This hands-on tutorial will train reproducible research warriors in the practices and tools that make experimental verification possible, built around an end-to-end data analysis workflow. The tutorial exposes attendees to open science methods from data gathering and storage through analysis, up to publication as a reproducible article.

Attendees are expected to have basic familiarity with scientific Python and Git.

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Matt McCormick
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Automation
Read the Fine Print

Workshop goals
- Why are we teaching this, and why is this important
  - for future and current you
  - for research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with:
  - links to the materials
  - schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Materials will be available after the workshop

How this workshop is run
- This is a Carpentries workshop, which means:
  - a friendly learning environment with a Code of Conduct
  - active learning
  - working with the people next to you
  - asking for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
François Michonneau
Kim Gilbert
Matt Pennell
Date Added:
08/07/2020
Reproducible Science Curriculum Lesson for Literate Programming
Read the Fine Print

Workshop goals
- Why are we teaching this, and why is this important
  - for future and current you
  - for research as a whole
- Lack of reproducibility in research is a real problem

Materials and how we'll use them
- Workshop landing page, with:
  - links to the materials
  - schedule

Structure oriented along the Four Facets of Reproducibility:

- Documentation
- Organization
- Automation
- Dissemination

Materials will be available after the workshop

How this workshop is run
- This is a Carpentries workshop, which means:
  - a friendly learning environment with a Code of Conduct
  - active learning
  - working with the people next to you
  - asking for help

Subject:
Applied Science
Information Science
Material Type:
Module
Author:
Ciera Martinez
Courtney Soderberg
Hilmar Lapp
Jennifer Bryan
Kristina Riemer
Naupaka Zimmerman
Date Added:
08/07/2020