
OSKB

This endorsement is the seal of approval for inclusion in the OSKB Library collections.

These resources have been vetted by the OSKB Team.

329 affiliated resources

R para Análisis Científicos Reproducibles (R for Reproducible Scientific Analysis)
Unrestricted Use
CC BY
Rating
0.0 stars

An introduction to R using the Gapminder data. The goal of this lesson is to teach novice programmers to write modular code and to adopt good practices when using R for data analysis. R provides a set of third-party packages that are commonly used across scientific disciplines for statistical analysis. We find that many scientists who attend Software Carpentry workshops use R and want to learn more. Our materials are relevant because they give attendees a solid foundation in the fundamentals of R and teach best practices of scientific computing: breaking analyses into modules, automating tasks, and encapsulation. Note that this workshop focuses on the fundamentals of the R programming language, not on statistical analysis. A variety of third-party packages are used throughout the workshop; they are not necessarily the best, nor are all of their features explained, but they are packages we consider useful, chosen mainly for their ease of use.

Subject:
Applied Science
Computer Science
Information Science
Mathematics
Measurement and Data
Material Type:
Module
Provider:
The Carpentries
Author:
A. s
Alejandra Gonzalez-Beltran
Ana Beatriz Villaseñor Altamirano
Antonio
AntonioJBT
Belinda Weaver
Claudia Engel
Cynthia Monastirsky
Daniel Beiter
David Mawdsley
David Pérez-Suárez
Erin Becker
EuniceML
François Michonneau
Gordon McDonald
Guillermina Actis
Guillermo Movia
Hely Salgado
Ido Bar
Ivan Ogasawara
Ivonne Lujano
James J Balamuta
Jamie McDevitt-Irwin
Jeff Oliver
Jonah Duckles
Juan M. Barrios
Katrin Leinweber
Kevin Alquicira
Kevin Martínez-Folgar
Laura Angelone
Laura-Gomez
Leticia Vega
Marcela Alfaro Córdoba
Marceline Abadeer
Maria Florencia D'Andrea
Marie-Helene Burle
Marieke Frassl
Matias Andina
Murray Cadzow
Narayanan Raghupathy
Naupaka Zimmerman
Paola Prieto
Paula Andrea Martinez
Raniere Silva
Rayna M Harris
Richard Barnes
Richard McCosh
Romualdo Zayas-Lagunas
Sandra Brosda
Sasha Lavrentovich
Shirley Alquicira Hernandez
Silvana Pereyra
Tobin Magle
Veronica Jimenez
juli arancio
raynamharris
saynomoregrl
Date Added:
08/07/2020
SPARC Popular Resources
Unrestricted Use
CC BY
Rating
0.0 stars

SPARC is a global coalition committed to making Open the default for research and education. SPARC empowers people to solve big problems and make new discoveries through the adoption of policies and practices that advance Open Access, Open Data, and Open Education.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
SPARC
Author:
Nick Shockey
Date Added:
01/31/2020
Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth Over Publishability
Unrestricted Use
CC BY
Rating
0.0 stars

An academic scientist’s professional success depends on publishing. Publishing norms emphasize novel, positive results. As such, disciplinary incentives encourage design, analysis, and reporting decisions that elicit positive results and ignore negative results. Prior reports demonstrate how these incentives inflate the rate of false effects in published science. When incentives favor novelty over replication, false results persist in the literature unchallenged, reducing efficiency in knowledge accumulation. Previous suggestions to address this problem are unlikely to be effective. For example, a journal of negative results publishes otherwise unpublishable reports. This enshrines the low status of the journal and its content. The persistence of false findings can be meliorated with strategies that make the fundamental but abstract accuracy motive—getting it right—competitive with the more tangible and concrete incentive—getting it published. This article develops strategies for improving scientific practices and knowledge accumulation that account for ordinary human motivations and biases.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Perspectives on Psychological Science
Author:
Brian A. Nosek
Jeffrey R. Spies
Matt Motyl
Date Added:
08/07/2020
Secondary Data Preregistration
Unrestricted Use
Public Domain
Rating
0.0 stars

Preregistration is the process of specifying project details, such as hypotheses, data collection procedures, and analytical decisions, prior to conducting a study. It is designed to make a clearer distinction between data-driven, exploratory work and a priori, confirmatory work. Both modes of research are valuable, but are easy to unintentionally conflate. See the Preregistration Revolution for more background and recommendations.

For research that uses existing datasets, there is an increased risk of analysts being biased by preliminary trends in the dataset. However, that risk can be balanced by proper blinding to any summary statistics in the dataset and the use of hold out datasets (where the "training" and "validation" datasets are kept separate from each other). See this page for specific recommendations about "split samples" or "hold out" datasets. Finally, if those procedures are not followed, disclosure of possible biases can inform the researcher and her audience about the proper role any results should have (i.e. the results should be deemed mostly exploratory and ideal for additional confirmation).
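The split-sample ("hold out") approach described above can be sketched in a few lines of Python. This is a minimal illustration, not the resource's own recommendation; the record structure, 50/50 split, and fixed seed are assumptions for the example:

```python
import random

def split_holdout(rows, train_fraction=0.5, seed=42):
    """Split a dataset into an exploratory ("training") set and a
    confirmatory ("validation") hold-out set that stays untouched
    until the preregistered analysis is run."""
    shuffled = rows[:]
    random.Random(seed).shuffle(shuffled)  # fixed seed makes the split reproducible
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical example: 10 participant records.
records = [{"id": i, "score": i * 3 % 7} for i in range(10)]
explore, confirm = split_holdout(records)
# Hypotheses are developed on `explore`; `confirm` is analyzed only
# once, after the analysis plan has been preregistered.
```

The key discipline is organizational rather than technical: the confirmatory half should not be inspected, even casually, before the preregistration is filed.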

This project contains a template for creating your preregistration, designed specifically for research using existing data. In the future, this template will be integrated into the OSF.

Subject:
Applied Science
Material Type:
Reading
Author:
Alexander C. DeHaven
Andrew Hall
Brian Brown
Charles R. Ebersole
Courtney K. Soderberg
David Thomas Mellor
Elliott Kruse
Jerome Olsen
Jessica Kosie
K. D. Valentine
Lorne Campbell
Marjan Bakker
Olmo van den Akker
Pamela Davis-Kean
Rodica I. Damian
Stuart J. Ritchie
Thuy-vy Nguyen
William J. Chopik
Sara J. Weston
Date Added:
08/12/2021
Sharing Detailed Research Data Is Associated with Increased Citation Rate
Unrestricted Use
CC BY
Rating
0.0 stars

Background: Sharing research data provides benefit to the general scientific community, but the benefit is less obvious for the investigator who makes his or her data available.

Principal Findings: We examined the citation history of 85 cancer microarray clinical trial publications with respect to the availability of their data. The 48% of trials with publicly available microarray data received 85% of the aggregate citations. Using linear regression, publicly available data was significantly (p = 0.006) associated with a 69% increase in citations, independently of journal impact factor, date of publication, and author country of origin.

Significance: This correlation between publicly available data and increased literature impact may further motivate investigators to share their detailed research data.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Douglas B. Fridsma
Heather A. Piwowar
Roger S. Day
Date Added:
08/07/2020
A Short Introduction to the Reproducibility Debate in Psychology
Unrestricted Use
CC BY
Rating
0.0 stars

The Journal of European Psychology Students (JEPS) is an open-access, double-blind, peer-reviewed journal for psychology students worldwide. JEPS is run by highly motivated European psychology students and has been publishing since 2009. By ensuring that authors are always provided with extensive feedback, JEPS gives psychology students the chance to gain experience in publishing and to improve their scientific skills. Furthermore, JEPS provides students with the opportunity to share their research and to take a first step toward a scientific career.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Journal of European Psychology Students
Author:
Cedric Galetzka
Date Added:
08/07/2020
Signaling the trustworthiness of science
Unrestricted Use
CC BY
Rating
0.0 stars

Trust in science increases when scientists and the outlets certifying their work honor science’s norms. Scientists often fail to signal to other scientists and, perhaps more importantly, the public that these norms are being upheld. They could do so as they generate, certify, and react to each other’s findings: for example, by promoting the use and value of evidence, transparent reporting, self-correction, replication, a culture of critique, and controls for bias. A number of approaches for authors and journals would lead to more effective signals of trustworthiness at the article level. These include article badging, checklists, a more extensive withdrawal ontology, identity verification, better forward linking, and greater transparency.

Subject:
Life Science
Social Science
Material Type:
Reading
Provider:
National Academy of Sciences
Author:
Kathleen Hall Jamieson
Marcia McNutt
Richard Sever
Veronique Kiermer
Date Added:
08/07/2020
A Social Psychological Model of Scientific Practices: Explaining Research Practices and Outlining the Potential for Successful Reforms
Unrestricted Use
CC BY
Rating
0.0 stars

A crescendo of incidents has raised concerns about whether scientific practices in psychology may be suboptimal, sometimes leading to the publication, dissemination, and application of unreliable or misinterpreted findings. Psychology has been a leader in identifying possibly suboptimal practices and proposing reforms that might enhance the efficiency of the scientific process and the publication of robust evidence and interpretations. To help shape future efforts, this paper offers a model of the psychological and socio-structural forces and processes that may influence scientists’ practices. The model identifies practices targeted by interventions and reforms, and which practices remain unaddressed. The model also suggests directions for empirical research to assess how best to enhance the effectiveness of psychological inquiry.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Psychologica Belgica
Author:
Jon A. Krosnick
Lee Jussim
Sean T. Stevens
Stephanie M. Anglin
Date Added:
08/07/2020
Social Science Workshop Overview
Unrestricted Use
CC BY
Rating
0.0 stars

Workshop overview for the Data Carpentry Social Sciences curriculum. Data Carpentry’s aim is to teach researchers basic concepts, skills, and tools for working with data so that they can get more done in less time, and with less pain. This workshop teaches data management and analysis for social science research, including best practices for data organization in spreadsheets, reproducible data cleaning with OpenRefine, and data analysis and visualization in R. This curriculum is designed to be taught over two full days of instruction. Materials for teaching data analysis and visualization in Python and extraction of information from relational databases using SQL are in development.

Interested in teaching these materials? We have an onboarding video and accompanying slides available to prepare Instructors to teach these lessons. After watching this video, please contact team@carpentries.org so that we can record your status as an onboarded Instructor. Instructors who have completed onboarding will be given priority status for teaching at centrally-organized Data Carpentry Social Sciences workshops.

Subject:
Applied Science
Information Science
Mathematics
Measurement and Data
Social Science
Material Type:
Module
Provider:
The Carpentries
Author:
Angela Li
Erin Alison Becker
Francois Michonneau
Maneesha Sane
Sarah Brown
Tracy Teal
Date Added:
08/07/2020
Software Carpentry
Unrestricted Use
CC BY
Rating
0.0 stars

Since 1998, Software Carpentry has been teaching researchers the computing skills they need to get more done in less time and with less pain. Our volunteer instructors have run hundreds of events for more than 34,000 researchers since 2012. All of our lesson materials are freely reusable under the Creative Commons - Attribution license.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Full Course
Provider:
Software Carpentry Community
Author:
Software Carpentry Community
Date Added:
06/18/2020
Statcheck
Read the Fine Print
Some Rights Reserved
Rating
0.0 stars

statcheck is a program that checks for errors in statistical reporting in APA-formatted documents. It was originally written in the R programming language. statcheck/web is a web-based implementation of statcheck. Using statcheck/web, you can check any PDF for statistical errors without installing the R programming language on your computer.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Activity/Lab
Provider:
Statcheck
Author:
Michele B. Nuijten
Sacha Epskamp
Date Added:
08/07/2020
Statistics and Quantitative Methods Example Videos for Teaching
Unrestricted Use
CC BY
Rating
0.0 stars

The goal of this repository is to index and host short videos that can be used to supplement the teaching of introductory statistics concepts. The purpose of these videos is to show students examples of statistical concepts being used in real research, building on the foundational understanding of the concepts they were introduced to in class. Let's show students real people, doing real research, and using real baby statistics to solve science! If you are interested in contributing a video to this page, please read the wiki, which explains what is needed in more depth. If you are still interested in contributing at that point, please request access as a contributor for the specific component(s) you would like to contribute to. If you would like to contribute a video for a topic that is not listed as a component, please contact JK Flake.

Subject:
Education
Material Type:
Lesson
Author:
Jessica Kay Flake
Date Added:
08/30/2021
Statistics of DOOM
Read the Fine Print
Rating
0.0 stars

About Stats of DOOM

Support Statistics of DOOM! This page and the YouTube channel help people learn statistics by providing step-by-step instructions for SPSS, R, Excel, and other programs. Demonstrations are provided covering power, data screening, analysis, write-up tips, effect sizes, and graphs. Help guides and course materials are also provided!

When I originally started posting my videos on YouTube, I never really thought people would be interested in them - minus a few overachieving students. I am glad that I've been able to help so many folks! I have taught many statistics courses - you can view full classes by using the Learn tab in the top right. I have also taught cognitive and language courses, some with coding (see the NLP and Language Modeling courses), and some without (see Other Courses). I hope this website provides structure to all my materials for you to use for yourself or your classroom.

Each page has an example syllabus, video lectures laid out with that syllabus (if I have them!), and links to the appropriate materials. Any broken links can be reported by sending me an email (linked at the bottom). Stats Tools was designed for learning statistics, which morphed into learning coding, open science, statistics, and more! Recommendations, comments, and other questions are welcome with the general suggestion to post on the specific video or page you have a question on. I do my best to answer, but also work a full-time job.

These resources wouldn't be possible without the help of many fantastic people over the years including:

All the Help Desk TAs: Rachel E. Monroe, Marshall Beauchamp, Louis Oberdiear, Simone Donaldson, Kim Koch, Jessica Willis, Samantha Hunter, Flora Forbes, Tabatha Hopke
Research colleagues: K.D. Valentine, John E. Scofield, Jeff Pavlacic
And more! Pages with specific content made by others are noted on that page.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lecture
Provider:
StatsTools
Author:
Erin M. Buchanan
Date Added:
08/07/2020
Statistics with JASP and the Open Science Framework
Unrestricted Use
CC BY
Rating
0.0 stars

This webinar will introduce the integration of JASP Statistical Software (https://jasp-stats.org/) with the Open Science Framework (OSF; https://osf.io). The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, like Dropbox, Box, Github, Mendeley, and now is integrated with JASP, to streamline workflows and increase efficiency.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Supporting Open Science Data Curation, Preservation, and Access by Libraries
Unrestricted Use
CC BY
Rating
0.0 stars

Openness in research can lead to greater reproducibility, an accelerated pace of discovery, and decreased redundancy of effort. In addition, open research ensures equitable access to knowledge and the ability for any community to assess, interrogate, and build upon prior work. It also requires open infrastructure and distributed access; but few institutions can provide all of these services alone. Providing a trustworthy network for perpetual availability of research data is critical to ensuring reproducibility, transparency, and ongoing inquiry.

Increased attention on the importance of open research and data sharing has led to a proliferation of platforms to store data, materials, etc., with limited technical integration. This can hinder data sharing, but also complicate coordination with local library expertise and services, thus hampering curation and long-term stewardship.

For example, the open source OSF enables researchers to directly create and manage research projects and integrates with other tools researchers use (Google Drive, Dropbox, Box, etc.), but lacks the ability to archive that material locally at a researcher’s institution. Long-term stewardship and preservation requires multiple copies of data archived in different locations, and creating archives seamlessly would be ideal.

COS and IA are working together to address these preservation and stewardship challenges by providing open, cooperative infrastructure to ensure long-term access and connection to research data, and by supporting and promoting adoption of open science practices to enhance research reproducibility as well as data sharing and reuse.

In this webinar, attendees will learn about both the technical and practical aspects of this collaborative project connecting the researcher tool OSF and the preservation system of Internet Archive. We demonstrate how researchers can improve the openness and reproducibility of their research through preregistration, and how those preregistrations are preserved with Internet Archive. We answer questions and explore use cases for how this powerful workflow can support library curation and stewardship of open research.

Subject:
Education
Material Type:
Lecture
Provider:
Center for Open Science
Date Added:
03/21/2021
Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias — An Updated Review
Unrestricted Use
CC BY
Rating
0.0 stars

Background: The increased use of meta-analysis in systematic reviews of healthcare interventions has highlighted several types of bias that can arise during the completion of a randomised controlled trial. Study publication bias and outcome reporting bias have been recognised as a potential threat to the validity of meta-analysis and can make the readily available evidence unreliable for decision making.

Methodology/Principal Findings: In this update, we review and summarise the evidence from cohort studies that have assessed study publication bias or outcome reporting bias in randomised controlled trials. Twenty studies were eligible, of which four were newly identified in this update. Only two followed the cohort all the way through from protocol approval to information regarding publication of outcomes. Fifteen of the studies investigated study publication bias and five investigated outcome reporting bias. Three studies found that statistically significant outcomes had higher odds of being fully reported compared to non-significant outcomes (range of odds ratios: 2.2 to 4.7). In comparing trial publications to protocols, we found that 40–62% of studies had at least one primary outcome that was changed, introduced, or omitted. We decided not to undertake meta-analysis due to the differences between studies.

Conclusions: This update does not change the conclusions of the review in which 16 studies were included. Direct empirical evidence for the existence of study publication bias and outcome reporting bias is shown. There is strong evidence of an association between significant results and publication; studies that report positive or significant results are more likely to be published, and outcomes that are statistically significant have higher odds of being fully reported. Publications have been found to be inconsistent with their protocols. Researchers need to be aware of the problems of both types of bias, and efforts should be concentrated on improving the reporting of trials.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Carrol Gamble
Jamie J. Kirkham
Kerry Dwan
Paula R. Williamson
Date Added:
08/07/2020
TOP Guidelines
Read the Fine Print
Rating
0.0 stars

The Transparency and Openness Promotion guidelines include eight modular standards, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to implement and select a level of implementation for each. These features provide flexibility for adoption depending on disciplinary variation, but simultaneously establish community standards.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
Open Science Collaboration
Author:
Open Science Collaboration
Date Added:
06/26/2015