All resources in Aging Science

Open Science Is for Aging Research, Too

In response to concerns about the replicability of published research, some disciplines have adopted open science practices to enhance the credibility of published findings. Gerontology has been slow to embrace these changes. We argue that open science is important for aging research, both to reduce questionable research practices that may be prevalent in the field (such as an implausibly high rate of reported significant age differences in the literature, underpowered studies, hypothesizing after the results are known, and a lack of belief updating when findings do not support theories) and to make research in the field more transparent overall. To ensure the credibility of gerontology research moving forward, we suggest concrete ways to incorporate open science into gerontology research: for example, by using available preregistration templates adaptable to a variety of study designs typical of aging research (even secondary analyses of existing data). Larger sample sizes may be achieved through many-lab collaborations. Though using open science practices may make some aspects of gerontology research more challenging, we believe that gerontology needs open science to ensure its credibility now and in the future.
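
A quick power calculation makes the abstract's point about underpowered studies and many-lab collaborations concrete. The sketch below is a hypothetical illustration, not material from the article; it assumes statsmodels is available and uses a conventional "small" effect size.

```python
from statsmodels.stats.power import TTestIndPower

# Participants needed per group to detect a small between-group difference
# (Cohen's d = 0.2) with 80% power at alpha = .05 in a two-sample design.
n_per_group = TTestIndPower().solve_power(effect_size=0.2, alpha=0.05, power=0.80)
print(round(n_per_group))  # ~394 per group -- rarely feasible for a single lab
```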

Material Type: Reading

Authors: Derek M. Isaacowitz, Majse Lind

Dementia, Big Data and Open Science

Although there is clear potential to improve science and innovation systems through big data and open science, barriers to data sharing remain. How can the available massive and diverse data collections be used and shared more efficiently to boost global research and innovation and improve care? What actions are needed to facilitate open access to research data generated with public funding? The OECD is bringing together policy makers, funding agencies and researchers to tackle the issue of open access to data, focused on developing good practice and principles for data governance. Four case studies highlight best practice and identify barriers to progress. Following an OECD-hosted consultation with the Ontario Brain Institute (OBI), the United Kingdom Medical Research Council (MRC), and the US Alzheimer’s Association, two concrete examples of global data sharing have been created. The first, focused on providing a wealth of open-source biomedical data for the community (deep data), builds upon GAAIN, the Global Alzheimer’s Association Interactive Network, and links eleven international partners through a federated network of data resources. The capability of this network is being extended significantly through connections with the French National Alzheimer’s Database (BNA), the European Medical Information Framework (EMIF), and the Canadian-based Longitudinal Online Research and Imaging System (LORIS). The second, focused on linking big data approaches at the population level (broad data), is a complementary collaboration between the Canadian Consortium on Neurodegeneration in Ageing and the Dementias Platform UK to share and analyse large-scale, complex, population-wide datasets from up to 2 million individuals, including imaging, genomics and health data. As a result, these collaborations will enable the aggregation of an unprecedented volume of individual- and population-level data, offering an open science solution that helps research tackle Alzheimer’s disease and related disorders more efficiently.

Material Type: Reading

Author: OECD

The Role of Replication Research in Advancing Gerontological Science: Trajectories, Transitions, and Typologies

The analysis of longitudinal observational data can take many forms and requires many decisions, and research findings and conclusions often differ across independent longitudinal studies addressing the same question. Differences in measurements, sample composition (e.g., age, cohort, country/culture), and statistical models (e.g., change/time function, covariate set, centering, treatment of incomplete data) can affect the replicability of results. The central aim of the Integrative Analysis of Longitudinal Studies of Aging (IALSA) research network (NIH/NIA P01AG043362) is to optimize opportunities for replication and cross-validation across heterogeneous sources of longitudinal data by evaluating comparable conceptual and statistical models at the construct level. We will provide an overview of the methodological challenges associated with comparative longitudinal and international research, including the comparability of alternative models of change, measurement harmonization and construct-level comparison, retest effects, distinguishing and contrasting between-person and within-person effects across studies, and the evaluation of alternative models for change over time. These methodological challenges and recommended approaches will be discussed within the context of reproducibility and replication research focused on longitudinal studies.
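
One of the challenges named above, distinguishing between-person from within-person effects, is commonly handled by person-mean centering a time-varying predictor before fitting a mixed model. The sketch below is a generic illustration of that technique (file and variable names are hypothetical), not the IALSA network's own code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long-format longitudinal data: one row per person per measurement occasion
# (file and column names are hypothetical).
df = pd.read_csv("longitudinal_long.csv")  # columns: id, age, memory

# Person-mean centering splits the predictor into a stable between-person
# component and an occasion-specific within-person deviation.
df["age_between"] = df.groupby("id")["age"].transform("mean")
df["age_within"] = df["age"] - df["age_between"]

# Random-intercept model estimating the two effects separately.
model = smf.mixedlm("memory ~ age_between + age_within", df, groups=df["id"])
print(model.fit().summary())
```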

Material Type: Reading

Author: Scott M. Hofer

7 Easy Steps to Open Science: An Annotated Reading List

The Open Science movement is rapidly changing the scientific landscape. Because exact definitions are often lacking and reforms are constantly evolving, accessible guides to open science are needed. This paper provides an introduction to open science and related reforms in the form of an annotated reading list of seven peer-reviewed articles, following the format of Etz et al. (2018). Written for researchers and students, particularly in psychological science, it highlights and introduces seven topics: understanding open science; open access; open data, materials, and code; reproducible analyses; preregistration and registered reports; replication research; and teaching open science. For each topic, we provide a detailed summary of one particularly informative and actionable article and suggest several further resources. Supporting a broader understanding of open science issues, this overview should enable researchers to engage with, improve, and implement current open, transparent, reproducible, replicable, and cumulative scientific practices.

Material Type: Reading

Authors: Alexander Etz, Amy Orben, Hannah Moshontz, Jesse Niebaum, Johnny van Doorn, Matthew Makel, Michael Schulte-Mecklenbeck, Sam Parsons, Sophia Crüwell

OSF 101

This webinar walks you through the basics of creating an OSF project, structuring it to fit your research needs, adding collaborators, and tying your favorite online tools into your project structure. OSF is a free, open-source web application built by the Center for Open Science, a non-profit dedicated to improving the alignment between scientific values and scientific practices. OSF is part collaboration tool, part version control software, and part data archive. It is designed to connect to popular tools researchers already use, such as Dropbox, Box, GitHub, and Mendeley, to streamline workflows and increase efficiency.

Material Type: Lecture

Author: Center for Open Science

An Introduction to Registered Reports for the Research Funder Community

In this webinar, Drs. David Mellor (Center for Open Science) and Stavroula Kousta (Nature Human Behaviour) discuss the Registered Reports publishing workflow and the benefits it may bring to funders of research. Dr. Mellor details the workflow and what it is intended to do, and Dr. Kousta discusses the lessons Nature Human Behaviour learned from its efforts to implement Registered Reports as a journal.

Material Type: Lecture

Author: Center for Open Science

Introduction to Preprints

This is a recording of a 45-minute introductory webinar on preprints. With our guest speaker Philip Cohen, we cover what preprints and postprints are and the benefits of preprints, and address some common concerns researchers may have. We show how to determine whether you can post preprints and postprints, and demonstrate how to use OSF Preprints (https://osf.io/preprints/) to share them. The OSF is the flagship product of the Center for Open Science, a non-profit technology start-up dedicated to improving the alignment between scientific values and scientific practices. Learn more at cos.io and osf.io, or email contact@cos.io.

Material Type: Lecture

Author: Center for Open Science

Dissemination and publication of research findings: an updated review of related biases

Objectives: To identify and appraise empirical studies on publication and related biases published since 1998; to assess methods for dealing with publication and related biases; and to examine, in a random sample of published systematic reviews, measures taken to prevent, reduce and detect dissemination bias.

Data sources: The main literature search, in August 2008, covered the Cochrane Methodology Register Database, MEDLINE, EMBASE, AMED and CINAHL. In May 2009, PubMed, PsycINFO and OpenSIGLE were also searched. Reference lists of retrieved studies were also examined.

Review methods: In Part I, studies were classified as evidence or method studies and data were extracted according to types of dissemination bias or methods for dealing with it. Evidence from empirical studies was summarised narratively. In Part II, 300 systematic reviews were randomly selected from MEDLINE and the methods used to deal with publication and related biases were assessed.

Results: Studies with significant or positive results were more likely to be published than those with non-significant or negative results, confirming findings from a previous HTA report. There was convincing evidence that outcome reporting bias exists and has an impact on the pooled summary in systematic reviews. Studies with significant results tended to be published earlier than studies with non-significant results, and empirical evidence suggests that published studies tended to report a greater treatment effect than those from the grey literature. Exclusion of non-English-language studies appeared to result in a high risk of bias in some areas of research, such as complementary and alternative medicine. In a few cases, publication and related biases had a potentially detrimental impact on patients or resource use. Publication bias can be prevented before a literature review (e.g. by prospective registration of trials), or detected during a literature review (e.g. by locating unpublished studies, funnel plots and related tests, sensitivity analysis modelling), or its impact can be minimised after a literature review (e.g. by confirmatory large-scale trials, updating the systematic review). The interpretation of funnel plots and related statistical tests, often used to assess publication bias, was frequently too simplistic and potentially misleading. More sophisticated modelling methods have not been widely used. Compared with systematic reviews published in 1996, recent reviews of health-care interventions were more likely to locate and include non-English-language studies and grey literature or unpublished studies, and to test for publication bias.

Conclusions: Dissemination of research findings is likely to be a biased process, although the actual impact of such bias depends on specific circumstances. The prospective registration of clinical trials and the endorsement of reporting guidelines may reduce research dissemination bias in clinical research. In systematic reviews, measures can be taken to minimise the impact of dissemination bias by systematically searching for and including relevant studies that are difficult to access. Statistical methods can be useful for sensitivity analyses. Further research is needed to develop methods for qualitatively assessing the risk of publication bias in systematic reviews, and to evaluate the effects of prospective registration of studies, open access policy and improved publication guidelines.
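
As one concrete instance of the "funnel plots and related tests" mentioned in the abstract, Egger's regression test asks whether smaller, less precise studies report systematically larger effects. The sketch below uses made-up effect sizes purely for illustration; it is not drawn from the report itself.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical study-level effect sizes and standard errors.
effects = np.array([0.42, 0.31, 0.55, 0.12, 0.48, 0.38, 0.60, 0.25])
ses = np.array([0.21, 0.15, 0.30, 0.08, 0.25, 0.18, 0.33, 0.11])

# Egger's test: regress the standardized effect (effect / SE) on precision (1 / SE).
# An intercept that departs from zero suggests funnel-plot asymmetry, one possible
# signature of publication bias.
fit = sm.OLS(effects / ses, sm.add_constant(1.0 / ses)).fit()
print(f"Egger intercept = {fit.params[0]:.2f}, p = {fit.pvalues[0]:.3f}")
```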

Material Type: Reading

Authors: AJ Sutton, C Hing, C Pang, CS Kwok, F Song, I Harvey, J Ryder, L Hooper, S Parekh, YK Loke

Secondary Data Preregistration

Preregistration is the process of specifying project details, such as hypotheses, data collection procedures, and analytical decisions, prior to conducting a study. It is designed to make a clearer distinction between data-driven, exploratory work and a priori, confirmatory work. Both modes of research are valuable, but they are easy to conflate unintentionally. See the Preregistration Revolution for more background and recommendations. For research that uses existing datasets, there is an increased risk that analysts will be biased by preliminary trends in the data. That risk can be mitigated by proper blinding to any summary statistics in the dataset and by the use of hold-out datasets (where the "training" and "validation" datasets are kept separate from each other). See this page for specific recommendations about "split sample" or "hold-out" datasets. Finally, if those procedures are not followed, disclosure of possible biases can inform the researcher and her audience about the proper role any results should play (i.e., the results should be regarded as mostly exploratory and as candidates for additional confirmation). This project contains a template for creating your preregistration, designed specifically for research using existing data. In the future, this template will be integrated into the OSF.
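
A minimal way to set up the split-sample approach described above is to carve off a hold-out portion of the existing dataset with a fixed random seed before examining any summary statistics. The sketch below is a generic illustration; the file names and the 50/50 split are assumptions, not part of the template.

```python
import pandas as pd

# Existing dataset that the preregistration will be written against
# (file name is hypothetical).
df = pd.read_csv("existing_cohort.csv")

# Reserve a hold-out sample BEFORE looking at any summary statistics; the fixed
# seed makes the split reproducible and documentable in the preregistration.
exploratory = df.sample(frac=0.5, random_state=2019)
holdout = df.drop(exploratory.index)

exploratory.to_csv("exploratory_sample.csv", index=False)
holdout.to_csv("holdout_sample.csv", index=False)  # analyze only after preregistering
```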

Material Type: Reading

Authors: Alexander C. DeHaven, Andrew Hall, Brian Brown, Charles R. Ebersole, Courtney K. Soderberg, David Thomas Mellor, Elliott Kruse, Jerome Olsen, Jessica Kosie, K. D. Valentine, Lorne Campbell, Marjan Bakker, Olmo van den Akker, Pamela Davis-Kean, Rodica I. Damian, Sara J. Weston, Stuart J. Ritchie, Thuy-vy Nguyen, William J. Chopik

Transparency and Open Science Symposium GSA 2019

The past decade has seen rapid growth in conversations around, and progress towards, a more transparent, open, and cumulative science. Best practices are being codified and established across fields relevant to gerontology, from cancer science to psychological science. Many of the areas currently under development are of particular relevance to gerontologists, such as best practices for balancing open science with participant confidentiality or for preregistering archival, longitudinal data analyses. The present panel showcases one of the particular strengths of the open science movement: the contribution that early career researchers are making to these ongoing conversations on best practices. Early career researchers have the opportunity to blend their expertise with technology, their knowledge of their disciplines, and their vision for the future in shaping these conversations. In this panel, three early career researchers share their insights. Pfund presents an introduction to preregistration and its value from the perspective of “growing up” within the open science movement. Seaman discusses efforts in and tools for transparency and reproducibility in the neuroimaging of aging. Ludwig introduces registered reports as a particularly useful form of publication for researchers who use longitudinal methods and/or work with hard-to-access samples. The symposium will include time for the audience to engage the panel in questions and discussion about current efforts in, and future directions for, transparent, open, and cumulative science in gerontology.

Material Type: Reading

Authors: Eileen K. Graham, Gabrielle N. Pfund, Jennifer Lodi-Smith, Kendra Leigh Seaman, Rita M. Ludwig

Equivalence Testing for Psychological Research: A Tutorial

Psychologists must be able to test both for the presence of an effect and for the absence of an effect. In addition to testing against zero, researchers can use the two one-sided tests (TOST) procedure to test for equivalence and reject the presence of a smallest effect size of interest (SESOI). The TOST procedure can be used to determine if an observed effect is surprisingly small, given that a true effect at least as extreme as the SESOI exists. We explain a range of approaches to determine the SESOI in psychological science and provide detailed examples of how equivalence tests should be performed and reported. Equivalence tests are an important extension of the statistical tools psychologists currently use and enable researchers to falsify predictions about the presence, and declare the absence, of meaningful effects.
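
To make the TOST logic concrete, the sketch below implements the two one-sided tests for two independent means from first principles. It is an independent Python illustration of the general procedure, not the tutorial's own materials, and the data and equivalence bounds are placeholders.

```python
import numpy as np
from scipy import stats

def tost_ind(x1, x2, low, high):
    """Two one-sided tests (TOST) for two independent means.

    Equivalence is supported when the mean difference can be declared both
    greater than `low` and smaller than `high` (the SESOI bounds).
    """
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    n1, n2 = len(x1), len(x2)
    diff = x1.mean() - x2.mean()

    # Pooled standard error (Student's t, equal-variance assumption).
    sp2 = ((n1 - 1) * x1.var(ddof=1) + (n2 - 1) * x2.var(ddof=1)) / (n1 + n2 - 2)
    se = np.sqrt(sp2 * (1 / n1 + 1 / n2))
    df = n1 + n2 - 2

    p_lower = stats.t.sf((diff - low) / se, df)    # H1: difference > lower bound
    p_upper = stats.t.cdf((diff - high) / se, df)  # H1: difference < upper bound

    # Both one-sided tests must be significant, so report the larger p value.
    return diff, max(p_lower, p_upper)

# Hypothetical use: are two age groups equivalent within raw-score bounds of +/- 0.5?
rng = np.random.default_rng(1)
young, old = rng.normal(10, 2, 80), rng.normal(10.1, 2, 80)
print(tost_ind(young, old, low=-0.5, high=0.5))
```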

Material Type: Reading

Authors: Anne Scheel, Daniel Lakens, Peder Isager

Aging Research and Open Science Supplemental Reading List

Open science practices are broadly applicable within the field of aging research. Across study types, these practices have the potential to drive changes in research practice that improve the integrity and reproducibility of studies in aging science. Resources on open science practices in aging research can, however, be challenging to discover because of the breadth of aging research and the range of resources available on the subject. By gathering resources on open science and aging research in a centralized location, we hope to make them easier to find and use for researchers who study aging and for any other interested parties. Unfortunately, not all resources are openly available. The following resources, while not open access, provide valuable perspectives, information, and insight into the open science movement and its place in aging research.

Material Type: Reading

Author: Olivia Lowrey