This collection contains materials pertaining to scholarly metrics and scholarly identity, including traditional and alternative metrics, academic social media, and related issues.
In 2016 the LIS-Bibliometrics Forum commissioned the development of a set of bibliometric competencies (2017 Model), available at https://thebibliomagician.wordpress.com/2017-competencies-archived/. The work, sponsored by a small research grant from Elsevier Research Intelligence Division, was led by Dr. Andrew Cox at the University of Sheffield, and Dr. Sabrina Petersohn of the Bergische Universität Wuppertal, Germany. The aim of the competency statements was to ensure that bibliometric practitioners were equipped to do their work responsibly and well.
The Competency Model was updated in July 2021 and includes a colour gradient to reflect the Levels and how they build upon one another. In particular, the 2021 competencies can help:
To identify skills gaps
To support progression through career stages for practitioners in the field of bibliometrics
To prepare job descriptions
The work underpinning the paper is available here: http://journals.sagepub.com/doi/abs/10.1177/0961000617728111. The competencies are intended to be a living document and will be reviewed over time.
Being active on social media, such as Twitter and blogs, is one way to reach a larger audience and to enhance a researcher’s impact. Through these additional channels, other researchers, as well as the public, policy makers, and the press, will learn about their findings. The toolkit shows several ways to get in touch with other researchers and discuss findings at an early stage in research networks, at conferences, and on social media. It presents open tools for co-writing, online meetings, and reference and project management.
Introduce graduate students and faculty in any discipline to the world of altmetrics and new ways to evaluate engagement with scholarly publications. The intention is not only to show new measurement techniques but also to walk learners through how to present their work in venues that will increase the visibility of their ideas and scholarly output.
This short course provides training materials on how to create a set of publication data, gather additional information about it through an API (Application Programming Interface), clean it, and analyze it in various ways. Developing these skills will assist academic librarians who are:
Negotiating a renewal of a journal package or an open access publishing agreement,
Interested in which journals the institution's authors published in or which repositories the institution’s authors shared their works in,
Looking to identify publications that could be added to their repository,
Searching for authors who do or do not publish OA in order to design outreach programs, or
Tracking how open access choices have changed over time.

After completing the lessons, the user will have gained an understanding of an institution’s publishing output, such as the number of publications per year, the open access status of the publications, major funders of the research, estimates of how much funding might be spent on article processing charges (APCs), and more. The user will also be better prepared to think critically about institutional publishing data to make sustainable and values-driven scholarly communications decisions.
The course is presented in two sections. Section 1 describes how to build a dataset. Section 2 describes a free, open source tool for working with data. Examples of how to do analyses both in OpenRefine and Microsoft Excel are provided.
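The course itself works in OpenRefine and Excel, but the same summaries can be sketched in a few lines of a scripting language. The snippet below is a minimal illustration, not taken from the course, of the analyses described above: publications per year, open access share, and a rough APC estimate. The record fields ("year", "is_oa", "apc_usd") are hypothetical stand-ins for cleaned institutional publication data.

```python
# Minimal sketch (not from the course) of the kind of analysis described
# above. The record fields below are hypothetical examples of what a
# cleaned institutional publication dataset might contain.
from collections import Counter

records = [
    {"year": 2020, "is_oa": True,  "apc_usd": 1500},
    {"year": 2020, "is_oa": False, "apc_usd": 0},
    {"year": 2021, "is_oa": True,  "apc_usd": 2000},
    {"year": 2021, "is_oa": True,  "apc_usd": 1800},
]

# Publications per year
pubs_per_year = Counter(r["year"] for r in records)

# Fraction of the output that is open access
oa_share = sum(r["is_oa"] for r in records) / len(records)

# Rough upper bound on APC spend (assumes every OA article paid its APC)
apc_total = sum(r["apc_usd"] for r in records if r["is_oa"])

print(pubs_per_year)  # counts per year
print(oa_share)       # 0.75 for this toy dataset
print(apc_total)      # 5300 for this toy dataset
```

The same grouping and summing operations map directly onto OpenRefine facets or Excel pivot tables; the script form simply makes the logic explicit.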
This short course was created for the Scholarly Communication Notebook. The file "Analyzing Institutional Publishing Output-A Short Course.docx" serves as a table of contents for the materials.
Created as a supplement for the Impact Measurement collection of the Scholarly Communication Notebook (SCN) to describe some of the core literature in the field as well as resources that cannot be included on the SCN, because they are not openly licensed but are free to read. This annotated bibliography is separated into three sections: peer-reviewed scholarly articles; blog posts, initiatives, and guides; and resources for further education and professional development. The first section is intended to help practitioners in the field of research assessment and bibliometrics to understand high-level core concepts in the field. The second section offers resources that are more applicable to practice. The final section includes links to blogs, communities, discussion lists, paid and free educational courses, and archived conferences, so that practitioners and professionals can stay abreast of emerging trends, improve their skills, and find community. Most of these resources could not be included on the Scholarly Communication Notebook, because they are not openly licensed. However, all resources in this bibliography are freely available to access and read.
Slides from the Keynote talk given at Virginia Tech Open Access Week on 20 October 2020. See the full presentation recording and panel discussion at https://vtechworks.lib.vt.edu/handle/10919/100682.
Virginia Tech's Open Access Week 2020 keynote speaker, Elizabeth (Lizzie) Gadd, Research Policy Manager (Publications) at Loughborough University in the UK, gives a talk about how what we reward through recruitment, promotion and tenure processes is not always what we actually value about research activity. The talk explores how we can pursue value-led evaluations - and how we can persuade senior leaders of their benefits.
The keynote talk is followed by a panel discussion with faculty members at Virginia Tech: Thomas Ewing (Associate Dean for Graduate Studies and Research and Professor of History), Carla Finkielstein (Associate Professor of Biological Sciences), Bikrum Gill (Assistant Professor of Political Science), and Sylvester Johnson (Professor and Director of the Center for Humanities). The panel is moderated by Tyler Walters (Dean, University Libraries).
The slides from this presentation are in Loughborough University's repository under a CC BY-NC-SA 4.0 license. https://repository.lboro.ac.uk/articles/presentation/Counting_what_counts_in_recruitment_promotion_and_tenure/13113860
The course (supported by the European Union's Horizon 2020 Programme and Seventh Framework Programme) will help you understand and justify the importance of public engagement as a key dimension of responsible research and innovation (RRI) and open science. It provides tools to design, implement, and assess a public engagement strategy within research funding and performing organizations.
Upon completion of this course, you will be able to:
Understand what public engagement for RRI and Open Science is.
Assess the level of engagement that your current R&I practice promotes.
Understand the importance of public engagement for RRI and Open Science.
Be aware of the tools, resources, and skills needed to start and implement public engagement processes.
Get an overview of journal-level bibliometrics such as the Journal Impact Factor, CiteScore, Eigenfactor Score, and others. Find out how they are calculated and where they can be found! Recommended for faculty, graduate students, postdoctoral researchers, and anyone interested in scholarly publications.
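As a quick illustration of the kind of calculation the course covers: the classic Journal Impact Factor for a year Y is the number of citations received in Y by items the journal published in Y-1 and Y-2, divided by the number of citable items it published in those two years. A minimal sketch follows; the figures are made up, and real values come from the proprietary citation counts behind Journal Citation Reports.

```python
# Simplified sketch of a Journal Impact Factor-style ratio.
# The numbers below are invented for illustration only.
def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """Citations received this year to items published in the previous
    two years, divided by the citable items from those two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# e.g. 600 citations in 2023 to the journal's 2021-2022 items,
# which numbered 250 citable items:
print(impact_factor(600, 250))  # 2.4
```

Other journal-level indicators (CiteScore, Eigenfactor Score) vary the citation window, the set of counted document types, or the weighting of citing journals, but share this citations-per-item structure.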
For a self-graded quiz and Certificate of Completion, go to https://bit.ly/scs-quiz1
More information about journal-level metrics: https://bit.ly/scs-impact-find
This review was commissioned by the joint UK higher education (HE) funding bodies as part of the Future Research Assessment Programme (FRAP). It revisits the findings of the 2015 review The Metric Tide to take a fresh look at the use of indicators in research management and assessment.
While this review feeds into the larger FRAP process, the authors have taken full advantage of their independence and sought to stimulate informed and robust discussion about the options and opportunities of future REF exercises. The report should be read in that spirit: as an input to ongoing FRAP deliberations, rather than a reflection of their likely or eventual conclusions.
The report is written in three sections. Section 1 plots the development of the responsible research assessment agenda since 2015 with a focus on the impact of The Metric Tide review and progress against its recommendations. Section 2 revisits the potential use of metrics and indicators in any future REF exercise, and proposes an increased uptake of ‘data for good’. Section 3 considers opportunities to further support the roll-out of responsible research assessment policies and practices across the UK HE sector. Appendices include an overview of progress against the recommendations of The Metric Tide and a literature review.
Abstract: For knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous, and transparent at all stages of design, execution, and reporting. Assessment of researchers still rarely includes considerations related to trustworthiness, rigor, and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity with a specific focus on the need to drive research improvement through ensuring that researchers are explicitly recognized and rewarded for behaviors that strengthen research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle, we provide a rationale for its inclusion and provide examples where these principles are already being adopted.
HuMetricsHSS supports the creation of values-based frameworks to guide all kinds of scholarly process, and to promote the nurturing of a values-enacted approach to academia writ large. During the 2016 Triangle Scholarly Communication Institute (SCI), the authors sketched a preliminary set of core values for enriching scholarship, highlighting five: Equity, Openness, Collegiality, Quality, Community. They created a framework which is intended to help transform how scholarship is created, assessed, and valued in the humanities.
At the workshops and in the toolkit, they emphasize that values are locally negotiated and frameworks locally built. That’s the explicit point of the workshop: to make space for open conversation about values and their meaning, to come to agreement on what matters for a given group, and then to work on constructing a framework that could be used to guide evaluation in the academy — whether that’s through the tenure and promotion process, the setting of annual goals, the hiring of new faculty, or decision-making about what kinds of digitization projects to take on, what kinds of collections to develop, or what kinds of projects to publish at an academic press.
The programme aims to equip learners with the skills and knowledge required to use a range of metrics around research impact and to understand the research landscape. This is a flexible programme – you can do as much or as little as suits you. While some Things are interlinked, each Thing is designed to be completed separately, in any order and at any level of complexity. Choose your own adventure!
There are three levels for each Thing:
Getting started is for you if you are just beginning to learn about each topic.
Learn more is for you if you know a bit but want to know more.
Challenge me is often more in-depth or assumes that you are familiar with at least the basics of each topic.
Journal article abstract: With the help of academic search engine optimization (ASEO), publications can more easily be found in academic search engines and databases. Authors can improve the ranking of their publications by adjusting titles, keywords, and abstracts. Carefully considered wording makes publications easier to find and, ideally, more frequently cited. This article is meant to support authors in making their scholarly publications more visible. It provides basic information on ranking mechanisms as well as tips and tricks on how to improve the findability of scholarly publications while also pointing out the limits of optimization. This article, authored by three scholarly communications librarians, draws on their experience of hosting journals, providing workshops for researchers and individual publication support, as well as on their investigations of the ranking algorithms of search engines and databases.
This introductory course from the FOSTER Consortium (supported by the European Union's Seventh Framework Programme for research, technological development and demonstration and the European Union's Horizon 2020 programme) will help you to understand what Responsible Research & Innovation (RRI) means, where it has come from, and why it can introduce an important and beneficial shift in relations between research, innovation and citizens.
Upon completing the course you will:
Understand what RRI means
Understand the reasons why the term RRI and related practices have emerged
Know about opportunities RRI can provide & obstacles you may face
Know the basics of how to start practicing RRI as a researcher and as an institution/industry
What does it mean to have meaningful metrics in today’s complex higher education landscape? With a foreword by Heather Piwowar and Jason Priem, this highly engaging and activity-laden book serves to introduce readers to the fast-paced world of research metrics from the unique perspective of academic librarians and LIS practitioners. Starting with the essential histories of bibliometrics and altmetrics, and continuing with in-depth descriptions of the core tools and emerging issues at stake in the future of both fields, Meaningful Metrics is a convenient all-in-one resource that is designed to be used by a range of readers, from those with little to no background on the subject to those looking to become movers and shakers in the current scholarly metrics movement. Authors Borchardt and Roemer offer tips, tricks, and real-world examples that illustrate how librarians can support the successful adoption of research metrics, whether in their institutions or across academia as a whole.
This UK report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration. This review has gone beyond earlier studies to take a deeper look at potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture. The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.
The Metrics Toolkit co-founders and editorial board developed the Metrics Toolkit to help scholars and evaluators understand and use citations, web metrics, and altmetrics responsibly in the evaluation of research.
The Metrics Toolkit provides evidence-based information about research metrics across disciplines, including how each metric is calculated, where you can find it, and how each should (and should not) be applied. You’ll also find examples of how to use metrics in grant applications, CVs, and promotion packages.
This resource links to the full course (all 13 weeks of modules) on the Internet Archive. The video lectures for the courses are also available on YouTube at https://www.youtube.com/watch?v=maRP_Wvc4eY&list=PLWYwQdaelu4en5MZ0bbg-rSpcfb64O_rd
This series was designed and taught by Chris Belter, Ya-Ling Lu, and Candace Norton at the NIH Library. It was originally presented in weekly installments to NIH Library staff from January-May 2019 and adapted for web viewing later the same year.
The goal of the series is to provide free, on-demand training on how we do bibliometrics for research evaluation. Although demand for bibliometric indicators and analyses in research evaluation is growing, broadly available and easily accessible training on how to provide those analyses is scarce. We have been providing bibliometric services for years, and we wanted to share our experience with others to facilitate the broader adoption of accurate and responsible bibliometric practice in research assessment. We hope this series acts as a springboard for others to get started with bibliometrics so that they feel more comfortable moving beyond this series on their own.
Navigating the Series The training series consists of 13 individual courses, organized into 7 thematic areas. Links to each course in the series are provided on the left. Each course includes a training video with audio transcription, supplemental reading to reinforce the concepts introduced in the course, and optional practice exercises.
We recommend that the courses be viewed in the order in which they are listed. The courses are listed in the same order as the analyses that we typically perform to produce one of our standard reports. Many of the courses also build on concepts introduced in previous courses, and may be difficult to understand if viewed out of order. We also recommend that the series be taken over the course of 13 consecutive weeks, viewing one course per week. A lot is covered in these courses, so it is a good idea to take your time with them to make sure you understand each course before moving on to the next. We also recommend you try to complete the practice exercises that accompany many of the courses, because the best way to learn bibliometrics is by doing it.
This 25-minute course from the University of Glasgow looks at: the thinking behind a move towards narrative CV and assessment formats; how the research landscape and research assessment practices are evolving, and efforts to develop fairer assessment approaches; advice and tips on what to include in a more narrative format; and examples from real narrative CVs written by early-career researchers. This course is directed at early-career researchers, specifically those making use of the Resume for Researchers format (e.g., via UK Research and Innovation (UKRI), a non-departmental public body of the UK Government that directs research and innovation funding). Many funding agencies, industry and corporate employers, and universities now require a more narrative-style CV to incorporate qualitative aspects into job applications, particularly in relation to describing input to publications and the significance of that input.
The goal of these formats is to help researchers share their varied contributions to research in a consistent way across a wide range of career paths and personal circumstances, and to move away from reliance on narrowly focused performance indicators that can make it harder to assess, reward, or nurture the full range of contributions that a researcher or academic makes to their field or discipline. This course helps researchers structure, write, and craft a narrative CV that highlights their individual academic accomplishments and contributions, with a particular emphasis on 'how' they contributed rather than only 'what' they contributed.
No restrictions on your remixing, redistributing, or making derivative works. Give credit to the author, as required.
Your remixing, redistributing, or making derivative works comes with some restrictions, including how it is shared.
Your redistributing comes with some restrictions. Do not remix or make derivative works.
Most restrictive license type. Prohibits most uses, sharing, and any changes.
Copyrighted materials, available under Fair Use and the TEACH Act for US-based educators, or other custom arrangements. Go to the resource provider to see their individual restrictions.