Search Resources

100 Results

Power, Profit, and Privilege: Problematizing Scholarly Publishing
Conditional Remix & Share Permitted
CC BY-NC

This open course introduces students to the scholarly communications system — with particular emphasis on the scholarly journal publishing mechanism — wherein new information is created, evaluated, disseminated, and preserved.

The course content is organized into three parts. First, The Fundamentals acquaints students with the basic framework of contemporary scholarly publishing: how it operates, who is involved, and what roles they play, and asks students to consider how they themselves might engage with the system as consumers and producers of scholarly knowledge. Chapters include sample exercises to reinforce content, as well as recommended resources for further study. Next, (Some) Problems raises questions and issues that complicate contemporary scholarly publishing. While scholarship and research have the noble goal of building and sharing new knowledge for the public good, they are also inextricably bound to real-world economic structures and inequalities. This section examines how the scholarly publishing system intersects with money, power, and privilege, and asks students to grapple with the system’s structural, systemic failings and to contemplate ways in which it might be improved. Finally, the course culminates in two Assignments that instructors can use as part of the curriculum, or that independent learners can work through on their own. These are open-ended: there are no single right or wrong answers, only opportunities for students to grapple with and reflect on the content of the course.

Material in this course can be used in classroom settings or as a self-paced tutorial. Appropriate audiences include upper-level undergraduate or graduate students who are interested in publishing their work; library & information science (LIS) students or early-career librarians interested in scholarly communications; and anyone else who wants a better understanding of the scholarly publishing system and the academic culture in which it is rooted.

Subject:
Applied Science
Information Science
Material Type:
Homework/Assignment
Reading
Author:
Amanda Makula
Date Added:
05/23/2022
Preregistration: Improve Research Rigor, Reduce Bias
Unrestricted Use
CC BY

In this webinar, Professor Brian Nosek, Executive Director of the Center for Open Science (https://cos.io), outlines the practice of preregistration and how it can help increase the rigor and reproducibility of research. The webinar is co-hosted by the Health Research Alliance, a collaborative member organization of nonprofit research funders. Slides are available at: https://osf.io/9m6tx/

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Preregistration in Complex Contexts: A Preregistration Template for the Application of Cognitive Models
Unrestricted Use
CC BY

In recent years, open science practices have become increasingly popular in psychology and related sciences. These practices aim to increase rigour and transparency in science as a potential response to the challenges posed by the replication crisis. Many of these reforms, including the highly influential preregistration, have been designed for experimental work that tests simple hypotheses with standard statistical analyses, such as assessing whether an experimental manipulation has an effect on a variable of interest. However, psychology is a diverse field of research, and the somewhat narrow focus of the prevailing discussions of, and templates for, preregistration has led to debates about how appropriate these reforms are for areas of research with more diverse hypotheses and more complex methods of analysis, such as cognitive modelling research within mathematical psychology. Our article attempts to bridge the gap between open science and mathematical psychology, focusing on the type of cognitive modelling that Crüwell, Stefan, & Evans (2019) labelled model application, where researchers apply a cognitive model as a measurement tool to test hypotheses about parameters of the cognitive model. Specifically, we (1) discuss several potential researcher degrees of freedom within model application, (2) provide the first preregistration template for model application, and (3) provide an example of a preregistered model application using our preregistration template. More broadly, we hope that our discussions and proposals constructively advance the debate surrounding preregistration in cognitive modelling, and provide a guide for how preregistration templates may be developed in other diverse or complex research contexts.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Author:
Nathan Evans
Sophia Crüwell
Date Added:
12/07/2019
Public Availability of Published Research Data in High-Impact Journals
Unrestricted Use
CC BY

Background: There is increasing interest in making primary data from published research publicly available. We aimed to assess the current status of making research data available in highly cited journals across the scientific literature.

Methods and Results: We reviewed the first 10 original research papers of 2009 published in the 50 original research journals with the highest impact factor. For each journal we documented the policies related to public availability and sharing of data. Of the 50 journals, 44 (88%) had a statement in their instructions to authors related to public availability and sharing of data. However, there was wide variation in journal requirements, ranging from requiring the sharing of all primary data related to the research to just including a statement in the published manuscript that data can be made available on request. Of the 500 assessed papers, 149 (30%) were not subject to any data availability policy. Of the remaining 351 papers that were covered by some data availability policy, 208 papers (59%) did not fully adhere to the data availability instructions of the journals they were published in, most commonly (73%) by not publicly depositing microarray data. The other 143 papers that adhered to the data availability instructions did so by publicly depositing only the specific data type as required, making a statement of willingness to share, or actually sharing all the primary data. Overall, only 47 papers (9%) deposited full primary raw data online. None of the 149 papers not subject to data availability policies made their full primary data publicly available.

Conclusion: A substantial proportion of original research papers published in high-impact journals are either not subject to any data availability policies, or do not adhere to the data availability instructions in their respective journals. This empirical evaluation highlights opportunities for improvement.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Alawi A. Alsheikh-Ali
John P. A. Ioannidis
Mouaz H. Al-Mallah
Waqas Qureshi
Date Added:
08/07/2020
Publication Bias in Psychology: A Diagnosis Based on the Correlation between Effect Size and Sample Size
Unrestricted Use
CC BY

Background: The p value obtained from a significance test provides no information about the magnitude or importance of the underlying phenomenon. Therefore, additional reporting of effect size is often recommended. Effect sizes are theoretically independent of sample size. Yet this may not hold true empirically: non-independence could indicate publication bias.

Methods: We investigated whether effect size is independent of sample size in psychological research. We randomly sampled 1,000 psychological articles from all areas of psychological research. We extracted p values, effect sizes, and sample sizes from all empirical papers, calculated the correlation between effect size and sample size, and investigated the distribution of p values.

Results: We found a negative correlation of r = −.45 [95% CI: −.53; −.35] between effect size and sample size. In addition, we found an inordinately high number of p values just passing the boundary of significance. Additional data showed that neither implicit nor explicit power analysis could account for this pattern of findings.

Conclusion: The negative correlation between effect size and sample size, and the biased distribution of p values, indicate pervasive publication bias in the entire field of psychology.
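
The core analysis described above is a simple correlation between two variables extracted from published articles. A minimal illustrative sketch of that kind of check, using hypothetical values rather than the study's data:

```python
# Illustrative only: hypothetical effect sizes and sample sizes standing in
# for values extracted from published articles.
import numpy as np
from scipy.stats import pearsonr

effect_sizes = np.array([0.82, 0.55, 0.40, 0.31, 0.22, 0.18])
sample_sizes = np.array([18, 35, 60, 110, 240, 500])

r, p = pearsonr(effect_sizes, sample_sizes)
print(f"r = {r:.2f} (p = {p:.3f})")
# A clearly negative r mirrors the paper's reported r = -.45 and is the pattern
# the authors interpret as a signature of publication bias.
```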

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
PLOS ONE
Author:
Anton Kühberger
Astrid Fritz
Thomas Scherndl
Date Added:
08/07/2020
P values in display items are ubiquitous and almost invariably significant: A survey of top science journals
Unrestricted Use
CC BY

P values represent a widely used, but pervasively misunderstood and fiercely contested, method of scientific inference. Display items, such as figures and tables, often contain the main results and are an important source of P values. We conducted a survey comparing the overall use of P values and the occurrence of significant P values in display items of a sample of articles in the three top multidisciplinary journals (Nature, Science, and PNAS) in 2017 and in 1997. We also examined the reporting of multiplicity corrections and its potential influence on the proportion of statistically significant P values. Our findings demonstrated substantial and growing reliance on P values in display items, with increases of 2.5 to 14.5 times in 2017 compared to 1997. The overwhelming majority of P values (94%, 95% confidence interval [CI] 92% to 96%) were statistically significant. Methods to adjust for multiplicity were almost non-existent in 1997, but were reported in many articles relying on P values in 2017 (Nature 68%, Science 48%, PNAS 38%). In their absence, almost all reported P values were statistically significant (98%, 95% CI 96% to 99%). Conversely, when any multiplicity corrections were described, 88% (95% CI 82% to 93%) of reported P values were statistically significant. Use of Bayesian methods was scant (2.5%), and articles rarely (0.7%) relied exclusively on Bayesian statistics. Overall, wider appreciation of the need for multiplicity corrections is a welcome evolution, but the rapid growth of reliance on P values and the implausibly high rates of reported statistical significance are worrisome.
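
As an illustration of the multiplicity corrections discussed in the abstract, here is a minimal sketch (with made-up P values, not figures from the survey) of a Bonferroni adjustment:

```python
# Hypothetical raw P values for several comparisons reported in one display item.
from statsmodels.stats.multitest import multipletests

raw_p = [0.012, 0.034, 0.049, 0.003, 0.041]
reject, adjusted_p, _, _ = multipletests(raw_p, alpha=0.05, method="bonferroni")

for p, p_adj, sig in zip(raw_p, adjusted_p, reject):
    print(f"raw p = {p:.3f}  adjusted p = {p_adj:.3f}  significant after correction: {sig}")
# With five comparisons the Bonferroni threshold is 0.05 / 5 = 0.01, so only the
# smallest raw P value remains significant after correction.
```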

Subject:
Mathematics
Statistics and Probability
Material Type:
Reading
Provider:
PLOS ONE
Author:
Ioana Alina Cristea
John P. A. Ioannidis
Date Added:
08/07/2020
Registered Reports Q&A
Unrestricted Use
CC BY

This webinar addresses questions related to writing, reviewing, editing, or funding a study using the Registered Report format, featuring Chris Chambers and ...

Subject:
Education
Material Type:
Lesson
Provider:
Center for Open Science
Author:
Chris Chambers
David Mellor
Date Added:
03/31/2021
Registered reports: an early example and analysis
Unrestricted Use
CC BY

The recent ‘replication crisis’ in psychology has focused attention on ways of increasing methodological rigor within the behavioral sciences. Part of this work has involved promoting ‘Registered Reports’, wherein journals peer review papers prior to data collection and publication. Although this approach is usually seen as a relatively recent development, we note that a prototype of this publishing model was initiated in the mid-1970s by parapsychologist Martin Johnson in the European Journal of Parapsychology (EJP). A retrospective and observational comparison of Registered and non-Registered Reports published in the EJP during a seventeen-year period provides circumstantial evidence to suggest that the approach helped to reduce questionable research practices. This paper aims both to bring Johnson’s pioneering work to a wider audience, and to investigate the positive role that Registered Reports may play in helping to promote higher methodological and statistical standards.

Subject:
Applied Science
Information Science
Psychology
Social Science
Material Type:
Reading
Provider:
PeerJ
Author:
Caroline Watt
Diana Kornbrot
Richard Wiseman
Date Added:
08/07/2020
Releasing a preprint is associated with more attention and citations for the peer-reviewed article
Unrestricted Use
CC BY

Preprints in biology are becoming more popular, but only a small fraction of the articles published in peer-reviewed journals have previously been released as preprints. To examine whether releasing a preprint on bioRxiv was associated with the attention and citations received by the corresponding peer-reviewed article, we assembled a dataset of 74,239 articles, 5,405 of which had a preprint, published in 39 journals. Using log-linear regression and random-effects meta-analysis, we found that articles with a preprint had, on average, a 49% higher Altmetric Attention Score and 36% more citations than articles without a preprint. These associations were independent of several other article- and author-level variables (such as scientific subfield and number of authors), and were unrelated to journal-level variables such as access model and Impact Factor. This observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.
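
The comparison described above amounts to a log-linear regression of citation counts on a preprint indicator. A minimal sketch with simulated data (not the study's dataset of 74,239 articles):

```python
# Simulated example: regress log-transformed citation counts on a 0/1 preprint flag.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
has_preprint = rng.integers(0, 2, size=500)
citations = rng.poisson(lam=np.exp(2.0 + 0.3 * has_preprint))  # simulated "preprint advantage"

X = sm.add_constant(has_preprint)
fit = sm.OLS(np.log1p(citations), X).fit()

# exp(coefficient) - 1 approximates the proportional citation difference,
# analogous to the 36% citation advantage reported in the article.
print(f"estimated citation advantage: {np.expm1(fit.params[1]):.0%}")
```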

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
eLife
Author:
Darwin Y Fu
Jacob J Hughey
Date Added:
08/07/2020
Risk of Bias in Reports of In Vivo Research: A Focus for Improvement
Unrestricted Use
CC BY

The reliability of experimental findings depends on the rigour of experimental design. Here we show limited reporting of measures to reduce the risk of bias in a random sample of life sciences publications, significantly lower reporting of randomisation in work published in journals of high impact, and very limited reporting of measures to reduce the risk of bias in publications from leading United Kingdom institutions. Ascertainment of differences between institutions might serve both as a measure of research quality and as a tool for institutional efforts to improve research quality.

Subject:
Biology
Life Science
Material Type:
Reading
Provider:
PLOS Biology
Author:
Aaron Lawson McLean
Aikaterini Kyriakopoulou
Andrew Thomson
Aparna Potluru
Arno de Wilde
Cristina Nunes-Fonseca
David W. Howells
Emily S. Sena
Gillian L. Currie
Hanna Vesterinen
Julija Baginskitae
Kieren Egan
Leonid Churilov
Malcolm R. Macleod
Nicki Sherratt
Rachel Hemblade
Stylianos Serghiou
Theo Hirst
Zsanett Bahor
Date Added:
08/07/2020
SPARC Popular Resources
Unrestricted Use
CC BY

SPARC is a global coalition committed to making Open the default for research and education. SPARC empowers people to solve big problems and make new discoveries through the adoption of policies and practices that advance Open Access, Open Data, and Open Education.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Reading
Provider:
SPARC
Author:
Nick Shockey
Date Added:
01/31/2020
Scholarly Communication Librarianship and Open Knowledge
Conditional Remix & Share Permitted
CC BY-NC

The intersection of scholarly communication librarianship and open education offers a unique opportunity to expand knowledge of scholarly communication topics in both education and practice. Open resources can address the gap in teaching timely and critical scholarly communication topics—copyright in teaching and research environments, academic publishing, emerging modes of scholarship, impact measurement—while increasing access to resources and equitable participation in education and scholarly communication.

Subject:
Applied Science
Information Science
Material Type:
Textbook
Provider:
Association of College and Research Libraries
Author:
Josh Bolick
Will Cross
Date Added:
02/01/2024
A Short Introduction to the Reproducibility Debate in Psychology
Unrestricted Use
CC BY

The Journal of European Psychology Students (JEPS) is an open-access, double-blind, peer-reviewed journal for psychology students worldwide. JEPS is run by highly motivated European psychology students and has been publishing since 2009. By ensuring that authors are always provided with extensive feedback, JEPS gives psychology students the chance to gain experience in publishing and to improve their scientific skills. Furthermore, JEPS provides students with the opportunity to share their research and to take a first step toward a scientific career.

Subject:
Psychology
Social Science
Material Type:
Reading
Provider:
Journal of European Psychology Students
Author:
Cedric Galetzka
Date Added:
08/07/2020
Statistics with JASP and the Open Science Framework
Unrestricted Use
CC BY

This webinar will introduce the integration of JASP Statistical Software (https://jasp-stats.org/) with the Open Science Framework (OSF; https://osf.io). The OSF is a free, open source web application built to help researchers manage their workflows. The OSF is part collaboration tool, part version control software, and part data archive. The OSF connects to popular tools researchers already use, such as Dropbox, Box, GitHub, and Mendeley, and is now integrated with JASP, to streamline workflows and increase efficiency.

Subject:
Applied Science
Computer Science
Information Science
Material Type:
Lecture
Provider:
Center for Open Science
Author:
Center for Open Science
Date Added:
08/07/2020
Systematic Review of the Empirical Evidence of Study Publication Bias and Outcome Reporting Bias — An Updated Review
Unrestricted Use
CC BY

Background: The increased use of meta-analysis in systematic reviews of healthcare interventions has highlighted several types of bias that can arise during the completion of a randomised controlled trial. Study publication bias and outcome reporting bias have been recognised as potential threats to the validity of meta-analysis and can make the readily available evidence unreliable for decision making.

Methodology/Principal Findings: In this update, we review and summarise the evidence from cohort studies that have assessed study publication bias or outcome reporting bias in randomised controlled trials. Twenty studies were eligible, of which four were newly identified in this update. Only two followed the cohort all the way through from protocol approval to information regarding publication of outcomes. Fifteen of the studies investigated study publication bias and five investigated outcome reporting bias. Three studies found that statistically significant outcomes had higher odds of being fully reported compared to non-significant outcomes (range of odds ratios: 2.2 to 4.7). In comparing trial publications to protocols, we found that 40–62% of studies had at least one primary outcome that was changed, introduced, or omitted. We decided not to undertake meta-analysis due to the differences between studies.

Conclusions: This update does not change the conclusions of the review in which 16 studies were included. Direct empirical evidence for the existence of study publication bias and outcome reporting bias is shown. There is strong evidence of an association between significant results and publication; studies that report positive or significant results are more likely to be published, and outcomes that are statistically significant have higher odds of being fully reported. Publications have been found to be inconsistent with their protocols. Researchers need to be aware of the problems of both types of bias, and efforts should be concentrated on improving the reporting of trials.

Subject:
Applied Science
Health, Medicine and Nursing
Material Type:
Reading
Provider:
PLOS ONE
Author:
Carrol Gamble
Jamie J. Kirkham
Kerry Dwan
Paula R. Williamson
Date Added:
08/07/2020
TOP Guidelines
Read the Fine Print

The Transparency and Openness Promotion guidelines include eight modular standards, each with three levels of increasing stringency. Journals select which of the eight transparency standards they wish to implement and select a level of implementation for each. These features provide flexibility for adoption depending on disciplinary variation, but simultaneously establish community standards.

Subject:
Applied Science
Life Science
Physical Science
Social Science
Material Type:
Lesson
Provider:
Open Science Collaboration
Author:
Open Science Collaboration
Date Added:
06/26/2015
Teacher Tools That Integrate Technology: Publishing on the Web
Conditional Remix & Share Permitted
CC BY-SA

This article for elementary teachers focuses on three tools that allow educators to publish to the web for free: Instructional Architect, Filamentality, and TeacherTube. With these tools, teachers can design hotlists, webquests, and scrapbooks, and upload video.

Subject:
Applied Science
Environmental Science
History
History, Law, Politics
Material Type:
Lesson Plan
Provider:
Ohio State University College of Education and Human Ecology
Provider Set:
Beyond Penguins and Polar Bears: An Online Magazine for K-5 Teachers
Author:
Kimberly Lightle
Date Added:
10/17/2014
Teaching Data Analysis in the Social Sciences: A case study with article level metrics
Conditional Remix & Share Permitted
CC BY-NC-SA

This case study is retrieved from the open book Open Data as Open Educational Resources: Case Studies of Emerging Practice.

Course description:

Metrics and measurement are important strategic tools for understanding the world around us. To take advantage of the possibilities they offer, however, one needs the ability to gather, work with, and analyse datasets, both big and small. This is why metrics and measurement feature in the seminar course Technology and Evolving Forms of Publishing, and why data analysis was a project option for the Technology Project course in Simon Fraser University’s Master of Publishing Program.

The assignment:

“Data Analysis with Google Refine and APIs”: Pick a dataset and an API of your choice (Twitter, VPL, Biblioshare, CrossRef, etc.) and combine them using Google Refine. Clean and manipulate your data for analysis. The complexity/messiness of your data will be taken into account.
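
As one possible starting point for this assignment, here is a minimal sketch, assuming the public CrossRef REST API (one of the suggested sources), of pulling a few records that could then be cleaned in Google Refine (now OpenRefine):

```python
# Illustrative only: fetch a handful of works from the CrossRef REST API.
import requests

resp = requests.get(
    "https://api.crossref.org/works",
    params={"query": "article level metrics", "rows": 5},
    timeout=30,
)
resp.raise_for_status()

for work in resp.json()["message"]["items"]:
    doi = work.get("DOI", "")
    title = (work.get("title") or ["(untitled)"])[0]
    cited_by = work.get("is-referenced-by-count", 0)
    print(f"{doi}\t{cited_by}\t{title}")
# The tab-separated output can be saved to a file and loaded into Google Refine
# (OpenRefine) or a spreadsheet for cleaning and analysis.
```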

Subject:
Applied Science
Information Science
Social Science
Sociology
Material Type:
Case Study
Author:
Alessandra Bordini
Juan Pablo Alperin
Katie Shamash
Date Added:
03/27/2019