All resources in Scholarly Communication Notebook

Data Analysis and Workflows

Understanding the types, processes, and frameworks of workflows and analyses helps researchers learn how research was created and what it may be used for. This lesson uses a subset of data analysis types to introduce reproducibility, iterative analysis, documentation, provenance, and different types of processes. The benefits of documenting and establishing informal (conceptual) and formal (executable) workflows are described in more detail.

Material Type: Lesson

Author: DataONE Community Engagement & Outreach Working Group

Legal & Policy Issues

Conversations regarding research data often intersect with questions related to ethical, legal, and policy issues for managing research data. This lesson will define copyrights, licenses, and waivers, discuss ownership and intellectual property, and describe some reasons for data restriction. After completing this lesson, participants will be able to identify ethical, legal, and policy considerations that surround the use and management of research data.

Material Type: Lesson

Author: DataONE Community Engagement & Outreach Working Group

Manage, Improve and Open up your Research and Data

This module looks at emerging trends and best practice in data management, quality assessment, and IPR issues. It examines policies regarding data management and their implementation, particularly in the framework of a Research Infrastructure.

Learning outcomes: by the end of this module, you should be able to:

- Understand and describe the FAIR Principles and what they are used for
- Understand and describe what a Data Management Plan is and how it is used
- Understand and explain what Open Data, Open Access, and Open Science mean for researchers
- Describe best practices around data management
- Understand and explain how Research Infrastructures interact with and inform policy on issues around data management

You can progress through this module in the order in which the sections are presented. However, this is merely a suggestion as to how you might approach the topic: you might choose to skip certain sections depending on your level of previous knowledge, navigating via the menu on the left-hand side. Each section has a set of resources and tools that you might find useful, as well as a list of items recommended for further reading on the subject.

Material Type: Module

Author: PARTHENOS

ZotLog: Inspiring students to adopt structured methods in Zotero

The purpose of this activity is to inspire students to adopt structured methods when they explore and retrieve information. It is based on lab notebooking methods and on managing and documenting the flow of references in Zotero, a reference management tool. The first principle is a tree of collections for managing the references arriving in the Zotero library; some basic methods are suggested, and students are invited to create their own. The second principle is standalone notes that document the entire research process across online databases, libraries, and experts.

Material Type: Activity/Lab, Homework/Assignment

Author: Pascal Martinolli

Everyday Data Management

This lesson introduces undergraduates to personal digital archiving (PDA) as an instructional bridge to research data management. PDA is the study of how people organize, maintain, use and share personal digital information in their daily lives. PDA skills closely parallel research data management skills, with the added benefit of being directly relevant to undergraduate students, most of whom manage complex personal digital content on a daily basis. By teaching PDA, librarians encourage authentic learning experiences that immediately resonate with students' day-to-day activities. Teaching PDA builds a foundation of knowledge that not only helps students manage their personal digital materials, but can be translated into research data management skills that will enhance students' academic and professional careers.

Material Type: Lesson

Authors: Ryer Banta, Sara Mannheimer

Research Evaluation Metrics

This module covers a number of methods, old and new, available for research evaluation. It comprises four units: Unit 1, Introduction to Research Evaluation Metrics and Related Indicators; Unit 2, Innovations in Measuring Science and Scholarship: Analytical Tools and Indicators in Evaluating Scholarly Communications; Unit 3, Article and Author Level Measurements; and Unit 4, Online Citation and Reference Management Tools. Brief overviews of the units are presented below.

Unit 1 discusses citation analysis, the use of citation-based indicators for research evaluation, common bibliometric indicators, classical bibliometric laws, author-level indicators drawn from authors' public profiles, and article-level metrics using altmetric tools. Author-level indicators and article-level metrics are comparatively new tools for research evaluation. Author-level indicators include the h-index, citation counts, i10-index, g-index, articles with citations, average citations per article, Eigenfactor score, impact points, and RG score. Article-level metrics, or altmetrics, based on Twitter, Facebook, Mendeley, CiteULike, and Delicious are also discussed. All technical terms used in the unit are defined.

Unit 2 deals with analytical tools and indicators used in evaluating scholarly communications. The tools covered are the Web of Science, Scopus, the Indian Citation Index (ICI), CiteSeerX, Google Scholar, and Google Scholar Citations. All of these except the Indian Citation Index are international in scope. ICI is not well known outside India, but it is a powerful tool for Indian scholarly literature, and since Indian journals publish a sizable amount of foreign literature, it will be useful for other countries as well. The analytical product offering journal performance metrics, Journal Citation Reports (JCR®), is also described. In the chapter titled New Platforms for Evaluating Scholarly Communications, three websites, namely SCImago Journal & Country Rank (SJR) [ScimagoJR.com], eigenFACTOR.org, and JournalMetrics.com, and one software package, Publish or Perish (POP), are discussed.

Article- and author-level measurements are discussed in Unit 3. Author and researcher identifiers are essential for searching databases on the web, because a name like D Singh can stand for a number of names such as Dan Singh, Dhan Singh, Dhyan Singh, Darbara Singh, Daulat Singh, Durlabh Singh, and more. ResearcherID.com, launched by Thomson Reuters, is a web-based global registry of authors and researchers that disambiguates each name. The Open Researcher and Contributor ID (ORCID) is also a registry that uniquely identifies an author or researcher. Both are discussed in this unit, along with article-level metrics (altmetrics) and how they can be measured with Altmetric.com and ImpactStory.org. Altmetrics for online journals are also touched upon. Of the many academic social networks, ResearchGate.net, Academia.edu, GetCited.org, and others are discussed. Regional journal networks with bibliometric indicators also exist; two networks of this type, SciELO (Scientific Electronic Library Online) and Redalyc, are covered.

The last unit, Unit 4, is on online citation and reference management tools. The tools discussed are Mendeley, CiteULike, Zotero, Google Scholar Library, and EndNote Basic. The features of all the tools are presented with figures, tables, and text boxes. This is Module Four of UNESCO's Open Access Curriculum for Researchers. Full text is available at http://unesdoc.unesco.org/images/0023/002322/232210E.pdf
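Several of the author-level indicators named in Unit 1 are simple functions of a list of citation counts. As an illustrative sketch (not part of the module itself, and using hypothetical citation counts), the h-index and i10-index can be computed as follows:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    # walking down the ranked list, the paper at rank i still "counts"
    # while it has at least i citations; the number of such papers is h
    return sum(1 for i, c in enumerate(ranked, start=1) if c >= i)

def i10_index(citations):
    """Number of papers with at least 10 citations (as used by Google Scholar)."""
    return sum(1 for c in citations if c >= 10)

counts = [25, 8, 5, 3, 3, 0]  # hypothetical citation counts for one author
print(h_index(counts))    # 3: three papers each have at least 3 citations
print(i10_index(counts))  # 1: only one paper has 10 or more citations
```

The g-index and average citations per article can be derived from the same list; the point of the sketch is only that these indicators are mechanical summaries of citation data, which is why the module stresses understanding what each one does and does not capture.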

Material Type: Full Course, Module, Textbook, Unit of Study

Author: Anup Kumar Das

Meaningful Metrics: A 21st-Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact

What does it mean to have meaningful metrics in today's complex higher education landscape? With a foreword by Heather Piwowar and Jason Priem, this highly engaging and activity-laden book introduces readers to the fast-paced world of research metrics from the unique perspective of academic librarians and LIS practitioners. Starting with the essential histories of bibliometrics and altmetrics, and continuing with in-depth descriptions of the core tools and emerging issues at stake in the future of both fields, Meaningful Metrics is a convenient all-in-one resource designed to be used by a range of readers, from those with little to no background on the subject to those looking to become movers and shakers in the current scholarly metrics movement. Authors Borchardt and Roemer offer tips, tricks, and real-world examples that illustrate how librarians can support the successful adoption of research metrics, whether in their institutions or across academia as a whole.

Material Type: Textbook

Authors: Rachel Borchardt, Robin Chin Roemer

Finding Impact Factor and Other Journal-Level Metrics

Get an overview of journal-level bibliometrics such as the Journal Impact Factor, CiteScore, Eigenfactor Score, and others. Find out how they are calculated and where they can be found! Recommended for faculty, graduate students, postdoctoral researchers, or anyone interested in scholarly publications. For a self-graded quiz and Certificate of Completion, go to https://bit.ly/scs-quiz1 More information about journal-level metrics: https://bit.ly/scs-impact-find
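For orientation before taking the lecture: the Journal Impact Factor has a simple published formula, namely citations received in year Y to items the journal published in years Y-1 and Y-2, divided by the number of citable items it published in those two years. A minimal sketch with made-up numbers (not from the lecture):

```python
def journal_impact_factor(citations_in_year, citable_items_prev_two_years):
    """JIF for year Y: citations in Y to items from years Y-1 and Y-2,
    divided by the citable items published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# hypothetical journal: 210 citations in 2023 to its 2021-2022 output,
# which comprised 120 citable items
print(journal_impact_factor(210, 120))  # 1.75
```

Other journal-level metrics covered in the lecture (CiteScore, Eigenfactor Score) use different citation windows, document types, and weightings, which is one reason the same journal can rank quite differently across them.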

Material Type: Lecture

Author: Kristy Padron

San Francisco Declaration on Research Assessment

The Declaration on Research Assessment (DORA) recognizes the need to improve the ways in which the outputs of scholarly research are evaluated. The declaration was developed in 2012 during the Annual Meeting of the American Society for Cell Biology in San Francisco. It has become a worldwide initiative covering all scholarly disciplines and all key stakeholders including funders, publishers, professional societies, institutions, and researchers. The DORA initiative encourages all individuals and organizations who are interested in developing and promoting best practice in the assessment of scholarly research to sign DORA. Other resources are available on their website, such as case studies of universities and national consortia that demonstrate key elements of institutional change to improve academic career success.

Material Type: Reading

Author: American Society for Cell Biology

Narrative CV: resources to help you write one

This 25-minute course from the University of Glasgow looks at: the thinking behind the move towards narrative CV and assessment formats; how the research landscape and research assessment practices are evolving, including efforts to develop fairer assessment approaches; advice and tips on what to include in a more narrative format; and examples from real narrative CVs written by early-career researchers. The course is directed at early-career researchers, specifically those making use of the Resume for Researchers format (e.g., via UK Research and Innovation (UKRI), a non-departmental public body of the UK Government that directs research and innovation funding). Many funding agencies, industry and corporate-sector employers, and universities now require a more narrative-style CV that incorporates qualitative aspects into job applications, particularly in relation to describing input to publications and the significance of those contributions. The goal of these formats is to help researchers share their varied contributions to research in a consistent way across a wide range of career paths and personal circumstances, and to move away from relying on narrowly focused performance indicators that can make it harder to assess, reward, or nurture the full range of contributions that a researcher or academic makes to their field or discipline. This course helps researchers structure, write, and craft a narrative CV that highlights their individual academic accomplishments and contributions, with a particular emphasis on 'how' they contributed rather than only 'what' they contributed.

Material Type: Module, Reading

Author: Lab for Academic Culture at the University of Glasgow

IATUL Research Impact Things – A self-paced training program for IATUL libraries

The programme aims to equip learners with the skills and knowledge required to engage with a range of metrics around research impact and to gain an understanding of the research landscape. This is a flexible programme: you can do as much or as little as suits you. While some Things are interlinked, each Thing is designed to be completed separately, in any order and at any level of complexity. Choose your own adventure! There are three levels for each Thing: Getting started, for those just beginning to learn about the topic; Learn more, for those who know a bit but want to know more; and Challenge me, which is often more in-depth or assumes familiarity with at least the basics of the topic.

Material Type: Lesson, Module, Reading

Author: IATUL Special Interest Group Metrics and Research Impact (SIG-MaRI)

Counting what counts in recruitment, promotion and tenure (Open Access Week 2020 Keynote Event)

Virginia Tech's Open Access Week 2020 keynote speaker, Elizabeth (Lizzie) Gadd, Research Policy Manager (Publications) at Loughborough University in the UK, gives a talk about how what we reward through recruitment, promotion, and tenure processes is not always what we actually value about research activity. The talk explores how we can pursue value-led evaluations and how we can persuade senior leaders of their benefits. The keynote talk is followed by a panel discussion with faculty members at Virginia Tech: Thomas Ewing (Associate Dean for Graduate Studies and Research and Professor of History), Carla Finkielstein (Associate Professor of Biological Sciences), Bikrum Gill (Assistant Professor of Political Science), and Sylvester Johnson (Professor and Director of the Center for Humanities). The panel is moderated by Tyler Walters (Dean, University Libraries). The slides from this presentation are in Loughborough University's repository under a CC BY-NC-SA 4.0 license. https://repository.lboro.ac.uk/articles/presentation/Counting_what_counts_in_recruitment_promotion_and_tenure/13113860

Material Type: Lecture

Authors: Bikrum Singh Gill, Carla Finkielstein, Elizabeth Gadd, Rachel Miles, Sylvester Johnson, Tom Ewing, Tyler Walters

Metrics Toolkit

The Metrics Toolkit co-founders and editorial board developed the Metrics Toolkit to help scholars and evaluators understand and use citations, web metrics, and altmetrics responsibly in the evaluation of research. The Metrics Toolkit provides evidence-based information about research metrics across disciplines, including how each metric is calculated, where you can find it, and how each should (and should not) be applied. You'll also find examples of how to use metrics in grant applications, CVs, and promotion packages.

Material Type: Reading

Authors: Heather Coates, Metrics Toolkit Editorial Board, Robin Champieux, Stacy Konkiel

The Metric Tide: Review of Metrics in Research Assessment

This UK report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration. This review has gone beyond earlier studies to take a deeper look at potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture. The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.

Material Type: Reading, Textbook

Authors: Ben Johnson, Eleonora Belfiore, Ian Viney, James Wilsdon, Jane Tinkler, Jude Hill, Liz Allen, Mike Thelwall, Paul Wouters, Philip Campbell, Richard Jones, Roger Kain, Simon Richard Kerridge, Stephen Curry, Steven Hill

Using InCites responsibly: a guide to interpretation and good practice

This guide has been created by bibliometric practitioners to support other users of InCites, a research analytics tool from Clarivate Analytics that uses bibliographic data from Web of Science; the guide promotes a community of informed and responsible use of research impact metrics. The recommendations in this document may be most suited to academic sector users, but the authors hope that other users may also benefit from the suggestions. The guide aims to provide plain-English definitions, key strengths and weaknesses, and some practical application tips for some of the most commonly used indicators available in InCites. The indicator definitions are followed by explanations of the data that powers InCites, educating users on where the data comes from and how the choices made in selecting and filtering data affect final results. Also included are a comparative table highlighting differences between indicators in InCites and SciVal, another commonly used bibliometric analytics programme, and instructions on how to run group reports. All of the advice in this document is underpinned by a belief in the need to use InCites in a way that respects the limitations of indicators as quantitative assessors of research outputs. Both authors are members of signatory institutions of DORA, the San Francisco Declaration on Research Assessment. A summary of advice on using indicators and bibliometric data responsibly is available on pages 4-5 and should be referred to throughout. Readers are also recommended to refer to the official InCites Indicators Handbook produced by Clarivate Analytics. The guide was written with complete editorial independence from Clarivate Analytics, the owners of InCites; Clarivate Analytics supported the authors by checking for factual accuracy only.

Material Type: Reading

Authors: Gray A, Price R

Using SciVal responsibly: a guide to interpretation and good practice

This guide is designed to help those who use SciVal, a research analytics tool from Elsevier that sources bibliographic data from Scopus, to source and apply bibliometrics in academic institutions. It was originally devised in February 2018 by Dr. Ian Rowlands of King's College London as a guide for his university, which makes SciVal widely available to its staff. King's does this because it believes that bibliometric data are best used in context by specialists in the field. A small group of LIS-Bibliometrics committee members reviewed and revised the King's guide to make it applicable to a wider audience. SciVal is a continually updated source, so feedback is always welcome at LISBibliometrics@jiscmail.ac.uk. LIS-Bibliometrics is keen that bibliometric data be used carefully and responsibly, which requires an understanding of the strengths and limitations of the indicators that SciVal publishes. The purpose of this guide is to help researchers and professional services staff make the most meaningful use of SciVal. It includes some important 'inside track' insights and practical tips that may not be found elsewhere. The scope and coverage limitations of SciVal are fairly widely understood and serve as a reminder that these metrics are not appropriate in fields where scholarly communication takes place mainly outside of the journal and conference literature. This is one of the many judgment calls that need to be made when putting bibliometric data into their proper context. One of the most useful features of SciVal is the ability to drill down in detail using various filters. This allows a user to define a set of publications accurately, but it may mean generating top-level measures based on small samples with considerable variance. Bibliometric distributions are often highly skewed, so even apparently simple concepts like the 'average' can be problematic. One objective of this guide is therefore to set out advice on sample sizes and broad confidence intervals, to avoid over-interpreting the headline data. Bibliometric indicators should always be used in combination, not in isolation, because each can only offer partial insights. They should also be used in a 'variable geometry' alongside other quantitative and qualitative indicators, including expert judgments and non-publication metrics such as grants or awards, to flesh out the picture.
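The point about skewed distributions is easy to see with a toy example (hypothetical citation counts, not drawn from SciVal): one highly cited paper pulls the mean far above the median, which is why the guide warns against over-interpreting 'average' citation figures for small samples.

```python
from statistics import mean, median

# hypothetical citation counts for a small publication set
citations = [0, 0, 1, 1, 2, 3, 120]

print(mean(citations))    # ~18.1, dominated by the single outlier
print(median(citations))  # 1, a better summary of the typical paper
```

In a set this small, adding or removing a single paper can swing the mean by an order of magnitude, which is exactly the sample-size caution the guide raises.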

Material Type: Reading

Authors: Elizabeth Gadd, Ian Rowlands, LIS-Bibliometrics Committee

Increasing visibility and discoverability of scholarly publications with academic search engine optimization

Journal article abstract: With the help of academic search engine optimization (ASEO), publications can more easily be found in academic search engines and databases. Authors can improve the ranking of their publications by adjusting titles, keywords and abstracts. Carefully considered wording makes publications easier to find and, ideally, cited more often. This article is meant to support authors in making their scholarly publications more visible. It provides basic information on ranking mechanisms as well as tips and tricks on how to improve the findability of scholarly publications while also pointing out the limits of optimization. This article, authored by three scholarly communications librarians, draws on their experience of hosting journals, providing workshops for researchers and individual publication support, as well as on their investigations of the ranking algorithms of search engines and databases.

Material Type: Reading

Authors: Christian Kaier, Karin Lackner, Lisa Schilhan