Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015, an increase of more than an order of magnitude from baseline. There was no change over time in the low rates of data sharing among comparison journals. Moreover, reporting openness does not guarantee openness. When badges were earned, reportedly available data were more likely to be actually available, correct, usable, and complete than when badges were not earned. Open materials also increased to a weaker degree, and there was more variability among comparison journals. Badges are simple, effective signals to promote open practices and improve preservation of data and materials by using independent repositories.
Access to data is a critical feature of an efficient, progressive, and ultimately self-correcting scientific ecosystem. But the extent to which the in-principle benefits of data sharing are realized in practice is unclear. Crucially, it is largely unknown whether published findings can be reproduced by repeating reported analyses on shared data (‘analytic reproducibility’). To investigate this, we conducted an observational evaluation of a mandatory open data policy introduced at the journal Cognition. Interrupted time-series analyses indicated a substantial post-policy increase in data availability statements (104/417, 25% pre-policy to 136/174, 78% post-policy), although not all data appeared reusable (23/104, 22% pre-policy to 85/136, 62% post-policy). For 35 of the articles determined to have reusable data, we attempted to reproduce 1324 target values. Ultimately, 64 values could not be reproduced within a 10% margin of error. For 22 articles all target values were reproduced, but 11 of these required author assistance. For 13 articles at least one value could not be reproduced despite author assistance. Importantly, there were no clear indications that original conclusions were seriously impacted. Mandatory open data policies can increase the frequency and quality of data sharing. However, suboptimal data curation, unclear analysis specification, and reporting errors can impede analytic reproducibility, undermining the utility of data sharing and the credibility of scientific findings.
Psychological science is navigating an unprecedented period of introspection about the credibility and utility of its research. A number of reform initiatives aimed at increasing adoption of transparency and reproducibility-related research practices appear to have been effective in specific contexts; however, their broader, collective impact amidst a wider discussion about research credibility and reproducibility is largely unknown. In the present study, we estimated the prevalence of several transparency and reproducibility-related indicators in the psychology literature published between 2014 and 2017 by manually assessing these indicators in a random sample of 250 articles. Over half of the articles we examined were publicly available (154/237, 65% [95% confidence interval, 59% to 71%]). However, sharing of important research resources such as materials (26/183, 14% [10% to 19%]), study protocols (0/188, 0% [0% to 1%]), raw data (4/188, 2% [1% to 4%]), and analysis scripts (1/188, 1% [0% to 1%]) was rare. Pre-registration was also uncommon (5/188, 3% [1% to 5%]). Although many articles included a funding disclosure statement (142/228, 62% [56% to 69%]), conflict of interest disclosure statements were less common (88/228, 39% [32% to 45%]). Replication studies were rare (10/188, 5% [3% to 8%]) and few studies were included in systematic reviews (21/183, 11% [8% to 16%]) or meta-analyses (12/183, 7% [4% to 10%]). Overall, the findings suggest that transparency and reproducibility-related research practices are far from routine in psychological science. Future studies can use the present findings as a baseline to assess progress towards increasing the credibility and utility of psychology research.
Registered reports present a substantial departure from traditional publishing models with the goal of enhancing the transparency and credibility of the scientific literature. We map the evolving universe of registered reports to assess their growth, implementation and shortcomings at journals across scientific disciplines.
No restrictions on your remixing, redistributing, or making derivative works. Give credit to the author, as required.
Your remixing, redistributing, or making derivative works comes with some restrictions, including how it is shared.
Your redistributing comes with some restrictions. Do not remix or make derivative works.
Most restrictive license type. Prohibits most uses, sharing, and any changes.
Copyrighted materials, available under Fair Use and the TEACH Act for US-based educators, or other custom arrangements. Go to the resource provider to see their individual restrictions.