Created as a supplement to the Impact Measurement collection of the Scholarly Communication Notebook (SCN), this bibliography describes some of the core literature in the field as well as resources that cannot be included on the SCN because they are not openly licensed but are free to read.

This annotated bibliography is separated into three sections: Peer reviewed scholarly articles; Blog posts, initiatives, and guides; and Resources for further education and professional development. The first section is intended to help practitioners in the field of research assessment and bibliometrics to understand high-level core concepts in the field. The second section offers resources that are more applicable to practice. The final section includes links to blogs, communities, discussion lists, paid and free educational courses, and archived conferences, so that practitioners and professionals can stay abreast of emerging trends, improve their skills, and find community. Most of these resources could not be included on the Scholarly Communication Notebook, because they are not openly licensed. However, all resources in this bibliography are freely available to access and read.
This review was commissioned by the joint UK higher education (HE) funding bodies as part of the Future Research Assessment Programme (FRAP). It revisits the findings of the 2015 review The Metric Tide to take a fresh look at the use of indicators in research management and assessment.
While this review feeds into the larger FRAP process, the authors have taken full advantage of their independence and sought to stimulate informed and robust discussion about the options and opportunities of future REF exercises. The report should be read in that spirit: as an input to ongoing FRAP deliberations, rather than a reflection of their likely or eventual conclusions.
The report is written in three sections. Section 1 plots the development of the responsible research assessment agenda since 2015 with a focus on the impact of The Metric Tide review and progress against its recommendations. Section 2 revisits the potential use of metrics and indicators in any future REF exercise, and proposes an increased uptake of ‘data for good’. Section 3 considers opportunities to further support the roll-out of responsible research assessment policies and practices across the UK HE sector. Appendices include an overview of progress against the recommendations of The Metric Tide and a literature review.
What does it mean to have meaningful metrics in today’s complex higher education landscape? With a foreword by Heather Piwowar and Jason Priem, this highly engaging and activity-laden book serves to introduce readers to the fast-paced world of research metrics from the unique perspective of academic librarians and LIS practitioners. Starting with the essential histories of bibliometrics and altmetrics, and continuing with in-depth descriptions of the core tools and emerging issues at stake in the future of both fields, Meaningful Metrics is a convenient all-in-one resource that is designed to be used by a range of readers, from those with little to no background on the subject to those looking to become movers and shakers in the current scholarly metrics movement. Authors Borchardt and Roemer offer tips, tricks, and real-world examples that illustrate how librarians can support the successful adoption of research metrics, whether in their institutions or across academia as a whole.
This UK report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration. This review has gone beyond earlier studies to take a deeper look at potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture.

The report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the ‘gaming’ of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy. Finally, it examines the role that metrics played in REF2014, and outlines scenarios for their contribution to future exercises.
The Metrics Toolkit co-founders and editorial board developed the Metrics Toolkit to help scholars and evaluators understand and use citations, web metrics, and altmetrics responsibly in the evaluation of research.
The Metrics Toolkit provides evidence-based information about research metrics across disciplines, including how each metric is calculated, where you can find it, and how each should (and should not) be applied. You’ll also find examples of how to use metrics in grant applications, CVs, and promotion packages.
This guide has been created by bibliometric practitioners to support other users of InCites, a research analytics tool from Clarivate Analytics that uses bibliographic data from Web of Science; the guide promotes a community of informed and responsible use of research impact metrics. The recommendations in this document may be most suited to academic sector users, but the authors hope that other users may also benefit from the suggestions. The guide aims to provide plain-English definitions, key strengths and weaknesses, and some practical application tips for some of the most commonly used indicators available in InCites. The indicator definitions are followed by explanations of the data that powers InCites, attempting to educate users on where the data comes from and how the choices made in selecting and filtering data will affect final results. Also in this document are a comparative table to highlight differences between indicators in InCites and SciVal, another commonly used bibliometric analytic programme, and instructions on how to run group reports.

All of the advice in this document is underpinned by a belief in the need to use InCites in a way that respects the limitations of indicators as quantitative assessors of research outputs. Both of the authors are members of signatory institutions of DORA, the San Francisco Declaration on Research Assessment. A summary of advice on using indicators and bibliometric data responsibly is available on pages 4-5 and should be referred to throughout. Readers are also recommended to refer to the official InCites Indicators Handbook produced by Clarivate Analytics. The guide was written with complete editorial independence from Clarivate Analytics, the owners of InCites. Clarivate Analytics supported the authors of this document by checking for factual accuracy only.
No restrictions on your remixing, redistributing, or making derivative works. Give credit to the author, as required.
Your remixing, redistributing, or making derivative works comes with some restrictions, including how the work is shared.
Your redistributing comes with some restrictions. Do not remix or make derivative works.
Most restrictive license type. Prohibits most uses, sharing, and any changes.
Copyrighted materials, available under Fair Use and the TEACH Act for US-based educators, or other custom arrangements. Go to the resource provider to see their individual restrictions.