
Search Resources

329 Results

Selected filters:
  • oskb
Managing a Personal Research Archive
Conditional Remix & Share Permitted
CC BY-NC

A class on setting up and managing research materials; on caring for digital files to enable collaboration, sharing, and re-use; and on helpful software and digital tools for organizing personal research files.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
New York University
Author:
Nick Wolf
Vicky Steeves
Date Added:
01/06/2020
Mapping the universe of registered reports
Read the Fine Print

Registered reports present a substantial departure from traditional publishing models with the goal of enhancing the transparency and credibility of the scientific literature. We map the evolving universe of registered reports to assess their growth, implementation and shortcomings at journals across scientific disciplines.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Nature Human Behaviour
Author:
John P. A. Ioannidis
Tom E. Hardwicke
Date Added:
08/07/2020
Materials for the Webinar "Helping Science Succeed: The Librarian’s Role in Addressing the Reproducibility Crisis"
Conditional Remix & Share Permitted
CC BY-NC

Headlines and scholarly publications portray a crisis in biomedical and health sciences. In this webinar, you will learn what the crisis is and the vital role of librarians in addressing it. You will see how you can directly and immediately support reproducible and rigorous research using your expertise and your library services. You will explore reproducibility guidelines and recommendations and develop an action plan for engaging researchers and stakeholders at your institution. #MLAReproducibility

Learning Outcomes
By the end of this webinar, participants will be able to:
  • describe the basic history of the “reproducibility crisis” and define reproducibility and replicability
  • explain why librarians have a key role in addressing concerns about reproducibility, specifically in terms of the packaging of science
  • explain 3-4 areas where librarians can immediately and directly support reproducible research through existing expertise and services
  • start developing an action plan to engage researchers and stakeholders at their institution about how they will help address research reproducibility and rigor

Audience
Librarians who work with researchers; librarians who teach, conduct, or assist with evidence synthesis or critical appraisal; and managers and directors who are interested in allocating resources toward supporting research rigor. No prior knowledge or skills required. Basic knowledge of scholarly research and publishing is helpful.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
UMN
Author:
Amy Riegelman
Frank Sayre
Date Added:
02/13/2020
The Meaningfulness of Effect Sizes in Psychological Research: Differences Between Sub-Disciplines and the Impact of Potential Biases
Unrestricted Use
CC BY

Effect sizes are the currency of psychological research. They quantify the results of a study to answer the research question and are used to calculate statistical power. The interpretation of effect sizes—when is an effect small, medium, or large?—has been guided by the recommendations Jacob Cohen gave in his pioneering writings starting in 1962: Either compare an effect with the effects found in past research or use certain conventional benchmarks. The present analysis shows that neither of these recommendations is currently applicable. From past publications without pre-registration, 900 effects were randomly drawn and compared with 93 effects from publications with pre-registration, revealing a large difference: Effects from the former (median r = .36) were much larger than effects from the latter (median r = .16). That is, certain biases, such as publication bias or questionable research practices, have caused a dramatic inflation in published effects, making it difficult to compare an actual effect with the real population effects (as these are unknown). In addition, there were very large differences in the mean effects between psychological sub-disciplines and between different study designs, making it impossible to apply any global benchmarks. Many more pre-registered studies are needed in the future to derive a reliable picture of real population effects.
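To make the comparison above concrete, here is a minimal sketch in Python. The correlation values in it are hypothetical stand-ins, not data from the article; only the benchmark cutoffs (r of roughly .10, .30, and .50 for small, medium, and large) follow the conventional Cohen labels mentioned in the description.

```python
# Illustrative only: hypothetical correlation coefficients standing in for
# effects from studies without and with pre-registration.
from statistics import median

unregistered_r = [0.45, 0.38, 0.33, 0.36, 0.52, 0.29]   # hypothetical values
preregistered_r = [0.12, 0.18, 0.21, 0.09, 0.16, 0.19]  # hypothetical values

def cohen_label(r: float) -> str:
    """Map |r| onto Cohen's conventional benchmarks (.10 small, .30 medium, .50 large)."""
    r = abs(r)
    if r < 0.10:
        return "negligible"
    if r < 0.30:
        return "small"
    if r < 0.50:
        return "medium"
    return "large"

for name, effects in [("without pre-registration", unregistered_r),
                      ("with pre-registration", preregistered_r)]:
    m = median(effects)
    print(f"median r {name}: {m:.2f} ({cohen_label(m)})")
```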

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Frontiers in Psychology
Author:
Marcus A. Schwarz
Thomas Schäfer
Date Added:
08/07/2020
Meeting the Requirements of Funders Around Open Science: Open Resources and Processes for Education
Unrestricted Use
CC BY

Expectations by funders for transparent and reproducible methods are on the rise. This session covers the expectations for preregistration, data sharing, and open-access results of three key funders of education research: the Institute of Education Sciences, the National Science Foundation, and Arnold Ventures. Presenters cover practical resources for meeting these requirements, such as the Registry for Efficacy and Effectiveness Studies (REES), the Open Science Framework (OSF), and EdArXiv.

Presenters: Jessaca Spybrook (Western Michigan University), Bryan Cook (University of Virginia), and David Mellor (Center for Open Science)

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Meta-assessment of bias in science
Unrestricted Use
CC BY

Numerous biases are believed to affect the scientific literature, but their actual prevalence across disciplines is unknown. To gain a comprehensive picture of the potential imprint of bias in science, we probed for the most commonly postulated bias-related patterns and risk factors, in a large random sample of meta-analyses taken from all disciplines. The magnitude of these biases varied widely across fields and was overall relatively small. However, we consistently observed a significant risk of small, early, and highly cited studies to overestimate effects and of studies not published in peer-reviewed journals to underestimate them. We also found at least partial confirmation of previous evidence suggesting that US studies and early studies might report more extreme effects, although these effects were smaller and more heterogeneously distributed across meta-analyses and disciplines. Authors publishing at high rates and receiving many citations were, overall, not at greater risk of bias. However, effect sizes were likely to be overestimated by early-career researchers, those working in small or long-distance collaborations, and those responsible for scientific misconduct, supporting hypotheses that connect bias to situational factors, lack of mutual control, and individual integrity. Some of these patterns and risk factors might have modestly increased in intensity over time, particularly in the social sciences. Our findings suggest that, besides one being routinely cautious that published small, highly-cited, and earlier studies may yield inflated results, the feasibility and costs of interventions to attenuate biases in the literature might need to be discussed on a discipline-specific and topic-specific basis.
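One conventional way to probe the small-study pattern described above is to check whether reported effects grow with their standard errors, in the spirit of an Egger-style regression. The sketch below is a simplified illustration on simulated data; it is not the authors' method, and the numbers have no relation to their sample.

```python
# Simplified small-study-effect check: regress effect size on standard error
# (Egger-style), using simulated data rather than the authors' meta-analyses.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
n_studies = 80
se = rng.uniform(0.05, 0.5, n_studies)        # small studies have large standard errors
true_effect = 0.2
# Simulate inflation in noisier studies to mimic a small-study effect.
effect = true_effect + 0.8 * se + rng.normal(0, se)

fit = linregress(se, effect)
print(f"slope = {fit.slope:.2f}, p = {fit.pvalue:.3g}")
# A positive, significant slope suggests smaller (noisier) studies report larger effects.
```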

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
National Academy of Sciences
Author:
Daniele Fanelli
John P. A. Ioannidis
Rodrigo Costas
Date Added:
08/07/2020
Metascience Forum 2020
Unrestricted Use
CC BY

In his talk, Professor Nosek defines replication as gathering evidence that tests an empirical claim made in an original paper. This intent influences the design and interpretation of a replication study and addresses confusion between conceptual and direct replications.

Subject:
Education
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Brian Nosek
Date Added:
03/21/2021
Misaligned Incentives Hurt Science, but We Can Fix Them
Unrestricted Use
CC BY

In this talk, Professor Corker shows how researchers are typically evaluated and contrasts that with more ideal ways to evaluate the process of scientific work. Funding for open practices and infrastructure, together with publication decisions made regardless of outcome, incentivizes the type of science we want to see.

Subject:
Education
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Katie Corker
Date Added:
03/21/2021
NIGMS Clearinghouse for Training Modules to Enhance Data Reproducibility
Read the Fine Print

In January 2014, NIH launched a series of initiatives to enhance rigor and reproducibility in research. As part of this initiative, NIGMS, along with nine other NIH institutes and centers, issued a funding opportunity announcement (FOA), RFA-GM-15-006, to develop, pilot, and disseminate training modules to enhance data reproducibility. This FOA was reissued in 2018 (RFA-GM-18-002). For the benefit of the scientific community, we will post the products of grants funded by these FOAs on this website as they become available. In addition, we are sharing other relevant training modules that have been developed, including courses developed from administrative supplements to NIGMS predoctoral T32 grants.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lecture
Provider:
NIH
Author:
National Institutes of Health
Date Added:
08/07/2020
No evidence of publication bias in climate change science
Unrestricted Use
CC BY

Non-significant results are less likely to be reported by authors and, when submitted for peer review, are less likely to be published by journal editors. This phenomenon, known collectively as publication bias, is seen in a variety of scientific disciplines and can erode public trust in the scientific method and the validity of scientific theories. Public trust in science is especially important for fields like climate change science, where scientific consensus can influence state policies on a global scale, including strategies for industrial and agricultural management and development. Here, we used meta-analysis to test for biases in the statistical results of climate change articles, including 1154 experimental results from a sample of 120 articles. Funnel plots revealed no evidence of publication bias given no pattern of non-significant results being under-reported, even at low sample sizes. However, we discovered three other types of systematic bias relating to writing style, the relative prestige of journals, and the apparent rise in popularity of this field: First, the magnitude of statistical effects was significantly larger in the abstract than the main body of articles. Second, the difference in effect sizes in abstracts versus main body of articles was especially pronounced in journals with high impact factors. Finally, the number of published articles about climate change and the magnitude of effect sizes therein both increased within 2 years of the seminal report by the Intergovernmental Panel on Climate Change 2007.
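For readers unfamiliar with the funnel plots mentioned above, the following sketch shows the basic idea on simulated values (not the article's 1154 results): each study's effect size is plotted against its precision, and an asymmetric gap of small, non-significant effects is what would indicate under-reporting.

```python
# Minimal funnel-plot sketch with simulated (not the article's) effect sizes.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n = 200
se = rng.uniform(0.02, 0.4, n)                 # standard error per simulated result
effect = rng.normal(0.1, se)                   # effects scatter more widely when SE is large

plt.scatter(effect, 1 / se, s=10, alpha=0.6)
plt.axvline(0.1, linestyle="--", linewidth=1)  # assumed underlying mean effect
plt.xlabel("effect size")
plt.ylabel("precision (1 / standard error)")
plt.title("Funnel plot: symmetry suggests no under-reporting of weak results")
plt.show()
```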

Subject:
Physical Science
Material Type:
Reading
Provider:
Climatic Change
Author:
Christian Harlos
Johan Hollander
Tim C. Edgell
Date Added:
08/07/2020
OSF101
Unrestricted Use
CC BY

This webinar walks you through the basics of creating an OSF project, structuring it to fit your research needs, adding collaborators, and tying your favorite online tools into your project structure. OSF is a free, open source web application built by the Center for Open Science, a non-profit dedicated to improving the alignment between scientific values and scientific practices. OSF is part collaboration tool, part version control software, and part data archive. It is designed to connect to popular tools researchers already use, like Dropbox, Box, Github, and Mendeley, to streamline workflows and increase efficiency.
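Beyond the web interface covered in the webinar, OSF also exposes a public REST API. The sketch below is a minimal, assumption-laden example of reading public metadata for a project: the GUID is a placeholder, and the endpoint and field names follow the documented v2 API layout but should be verified against developer.osf.io.

```python
# Minimal sketch: read public metadata for an OSF project over the v2 REST API.
# The GUID below is a placeholder; response field names follow the JSON:API
# layout the OSF API documents, but treat them as assumptions.
import requests

node_id = "abcde"  # placeholder GUID of a public OSF project
resp = requests.get(f"https://api.osf.io/v2/nodes/{node_id}/", timeout=10)
resp.raise_for_status()

attrs = resp.json()["data"]["attributes"]
print(attrs.get("title"), "-", attrs.get("description"))
```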

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
OSF Collections: supporting research discoverability and reuse
Unrestricted Use
CC BY

The OSF Collections repository platform supports the discoverability and reuse of research by enabling the aggregation of related projects across OSF. With OSF Collections, any funder, journal, society, or research community can show their commitment to scientific integrity by aggregating the open outputs from their disciplines, grantees, journal articles, or more. Learn how research collections can foster new norms for sharing, collaboration, and reproducibility.

We also provide a demo of how OSF Collections aggregates and hosts your research by discipline, funded outcomes, project type, journal issue, and more.

Visit cos.io/collections to learn more.

Subject:
Education
Material Type:
Lesson
Provider:
Center for Open Science
Date Added:
03/31/2021
OSF In The Lab: Organizing related projects with Links, Forks, and Templates
Unrestricted Use
CC BY

Files for this webinar are available at https://osf.io/ewhvq/. This webinar focuses on how to use the Open Science Framework (OSF) to tie together and organize multiple projects. We look at example structures appropriate for organizing classroom projects, a line of research, or a whole lab's activity. We discuss the OSF's capabilities for using projects as templates, linking projects, and forking projects, as well as some considerations for using each of those capabilities when designing a structure for your own project. The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, Github and Mendeley, to streamline workflows and increase efficiency.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
OSF in the Classroom
Unrestricted Use
CC BY

This webinar will introduce how to use the Open Science Framework (OSF; https://osf.io) in the classroom. The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, Github and Mendeley, to streamline workflows and increase efficiency. This webinar will discuss how to introduce reproducible research practices to students, show ways of tracking student activity, and introduce the use of Templates and Forks on the OSF to allow students to easily make new class projects. The OSF is the flagship product of the Center for Open Science, a non-profit technology start-up dedicated to improving the alignment between scientific values and scientific practices. Learn more at cos.io and osf.io, or email contact@cos.io.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
OSKB Group Admin Editor Roles and Responsibilities
Unrestricted Use
CC BY

Each OSKB Group will have Group Administrators. Group Administrators act as Editors in the OSKB and have related roles and responsibilities. This is an evolving document, maintained by members of the OSKB Editorial Subcommittee, to provide transparency on how the editorial process works, who is responsible for what, and how editorial conflicts or administrative issues may be resolved.

Subject:
Education
Material Type:
Student Guide
Author:
OSKB Admin
Date Added:
08/07/2020
OSKB Roadmap
Unrestricted Use
CC BY

Welcome to the Open Scholarship Knowledge Base Roadmap! Our mission is to help people quickly and confidently learn and apply open scholarship practices. What are we planning to do in our project? The OSKB Roadmap is a living document that shows where we are heading in the short term and the long term. Please check out our Contributor Guides and Code of Conduct to get started!

Subject:
Applied Science
Education
Information Science
Material Type:
Diagram/Illustration
Author:
OSKB Admin
Date Added:
07/22/2020
On the Plurality of (Methodological) Worlds: Estimating the Analytic Flexibility of fMRI Experiments
Unrestricted Use
CC BY

How likely are published findings in the functional neuroimaging literature to be false? According to a recent mathematical model, the potential for false positives increases with the flexibility of analysis methods. Functional MRI (fMRI) experiments can be analyzed using a large number of commonly used tools, with little consensus on how, when, or whether to apply each one. This situation may lead to substantial variability in analysis outcomes. Thus, the present study sought to estimate the flexibility of neuroimaging analysis by submitting a single event-related fMRI experiment to a large number of unique analysis procedures. Ten analysis steps for which multiple strategies appear in the literature were identified, and two to four strategies were enumerated for each step. Considering all possible combinations of these strategies yielded 6,912 unique analysis pipelines. Activation maps from each pipeline were corrected for multiple comparisons using five thresholding approaches, yielding 34,560 significance maps. While some outcomes were relatively consistent across pipelines, others showed substantial methods-related variability in activation strength, location, and extent. Some analysis decisions contributed to this variability more than others, and different decisions were associated with distinct patterns of variability across the brain. Qualitative outcomes also varied with analysis parameters: many contrasts yielded significant activation under some pipelines but not others. Altogether, these results reveal considerable flexibility in the analysis of fMRI experiments. This observation, when combined with mathematical simulations linking analytic flexibility with elevated false positive rates, suggests that false positive results may be more prevalent than expected in the literature. This risk of inflated false positive rates may be mitigated by constraining the flexibility of analytic choices or by abstaining from selective analysis reporting.
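The pipeline counts quoted above are simply the product of the per-step choices, and the combinatorial growth is easy to reproduce. In the sketch below, the per-step option counts are a hypothetical breakdown chosen only so that the product matches the 6,912 figure (the study's actual breakdown may differ); multiplying by the five thresholding approaches then gives the 34,560 maps.

```python
# Count how quickly per-step analysis choices multiply into unique pipelines.
# The per-step option counts below are hypothetical (chosen so the product
# matches the paper's 6,912 figure); the study's actual breakdown may differ.
from itertools import product
from math import prod

options_per_step = [2, 2, 2, 2, 2, 2, 3, 3, 3, 4]   # 10 steps, 2-4 strategies each
pipelines = list(product(*[range(n) for n in options_per_step]))

print(len(pipelines))             # 6912 unique analysis pipelines
print(prod(options_per_step))     # same count, computed directly
print(len(pipelines) * 5)         # 34560 maps after 5 thresholding approaches
```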

Subject:
Applied Science
Biology
Health, Medicine and Nursing
Life Science
Psychology
Social Science
Material Type:
Reading
Provider:
Frontiers in Neuroscience
Author:
Joshua Carp
Date Added:
08/07/2020
On the reproducibility of science: unique identification of research resources in the biomedical literature
Unrestricted Use
CC BY

Scientific reproducibility has been at the forefront of many news stories and there exist numerous initiatives to help address this problem. We posit that a contributor is simply a lack of specificity that is required to enable adequate research reproducibility. In particular, the inability to uniquely identify research resources, such as antibodies and model organisms, makes it difficult or impossible to reproduce experiments even where the science is otherwise sound. In order to better understand the magnitude of this problem, we designed an experiment to ascertain the “identifiability” of research resources in the biomedical literature. We evaluated recent journal articles in the fields of Neuroscience, Developmental Biology, Immunology, Cell and Molecular Biology and General Biology, selected randomly based on a diversity of impact factors for the journals, publishers, and experimental method reporting guidelines. We attempted to uniquely identify model organisms (mouse, rat, zebrafish, worm, fly and yeast), antibodies, knockdown reagents (morpholinos or RNAi), constructs, and cell lines. Specific criteria were developed to determine if a resource was uniquely identifiable, and included examining relevant repositories (such as model organism databases, and the Antibody Registry), as well as vendor sites. The results of this experiment show that 54% of resources are not uniquely identifiable in publications, regardless of domain, journal impact factor, or reporting requirements. For example, in many cases the organism strain in which the experiment was performed or antibody that was used could not be identified. Our results show that identifiability is a serious problem for reproducibility. Based on these results, we provide recommendations to authors, reviewers, journal editors, vendors, and publishers. Scientific efficiency and reproducibility depend upon a research-wide improvement of this substantial problem in science today.

Subject:
Biology
Life Science
Social Science
Material Type:
Reading
Provider:
PeerJ
Author:
Gregory M. LaRocca
Holly Paddock
Laura Ponting
Matthew H. Brush
Melissa A. Haendel
Nicole A. Vasilevsky
Shreejoy J. Tripathy
Date Added:
08/07/2020
Open Access Directory
Unrestricted Use
CC BY

The Open Access Directory is an online compendium of factual lists about open access to science and scholarship, maintained by the community at large. It exists as a wiki hosted by the School of Library and Information Science at Simmons University in Boston, USA. The goal is for the open access community itself to enlarge and correct the lists with little intervention from the editors or editorial board. For quality control, editing privileges are granted to registered users. As far as possible, lists are limited to brief factual statements without narrative or opinion.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
OAD Simmons
Author:
OAD Simmons
Date Added:
08/07/2020