This document discusses altmetrics, which are alternative metrics for measuring research impact beyond citations. It provides examples of altmetric data sources like tweets, blogs, and news articles. The document also presents case studies of researchers and articles to demonstrate altmetric measurements. It discusses issues around gaming the system and outlines future directions for altmetrics, including increased transparency, standards development, and assessing correlations with other impact measures.
I apply Ranganathan's 5 laws of library science to altmetrics, as part of a holistic research impact support service. I discuss what altmetrics are, what they measure, their uses throughout the research lifecycle, and where you can get them. I then apply Ranganathan's 4th law, saving the time of the user, to the harvesting of altmetrics by research information systems, embedding them at the point of need. The challenge of altmetrics is to change our concept of what an institutional repository is, from a simple container of research outputs to a smart system that harvests and catalogs a much broader range of output and impact data elements.
Joining the ‘buzz’: the role of social media in raising research visibility - Eileen Shepherd
Traditional bibliometric methods of evaluating academic research, such as journal impact factors and article citations, have been supplemented in the past 5-10 years by the development of altmetrics (alternative metrics or article-level metrics). Altmetrics measure the impact of research, data and publications through indicators such as references in data and knowledge bases, article views, downloads, and mentions in social and news media. This presentation gives a brief background to altmetrics and demonstrates how Rhodes University librarians are using social media to raise the visibility of their institution's research output. (Rhodes University is in Grahamstown, South Africa.)
Updated 30/01/2015
This session included discussions around the value of bibliometrics for individual performance management/promotion and the REF.
What are bibliometrics?
Journal metrics
Personal metrics
Article level metrics and altmetrics
Tweet Your Pubs: How Altmetrics are Changing the Way We Measure Research Impact - Robin Featherstone
Presentation given to the Northern Alberta Health Libraries Association (NAHLA) Trends Mini Conference in Edmonton at the University of Alberta on May 2, 2014
Brace for Impact: New Means for Measuring Research Metrics - Mary Ellen Sloane
As open access journals and repositories gain a foothold in scholarly communication, researchers are finding that the traditional impact factor and citation count metrics only reflect a portion of the dissemination of scholarly works.
New technology, research, and citation tools aid our ability to measure the influence of research. Tools and initiatives such as PLoS Article-Level Metrics, BePress’ Author Dashboard, Mendeley, Altmetric, and ImpactStory are providing a more robust picture of scholarly communication today.
This presentation provides an overview of the impact factor system and new tools for gathering metrics and their relevance for librarians and researchers.
Presentation given at the Library Information Technology Association (LITA) Forum in Louisville, KY, in November 2013.
Dear Editor: I read your publication ethics issue on “bogus impact factors” with great interest (1). I would like to draw attention to a new trend in manipulating citation counts. There are several ethical approaches to increasing the number of citations for a published paper (2). However, it is apparent that some manipulation of citation numbers is occurring (3, 4). Self-citations, “those in which the authors cite their own works,” account for a significant portion of all citations (5). With the advent of information technology, it is easy to identify unusual citation trends for a paper or a journal. A web application to calculate the single-publication h-index based on (6) is available online (7, 8). A tool developed by Francisco Couto (9) can measure authors’ citation impact while excluding self-citations.

Self-citation is ethical when it is a necessity. Nevertheless, there is a threshold for self-citations. Thomson Reuters’ Web of Science (WoS), which currently lists journal impact factors, considers self-citation acceptable up to a rate of 20%; anything over that is considered suspect (10). In some journals, even 5% is considered a high rate of self-citation. The Journal Citation Reports is a reliable source for checking the acceptable level of self-citation in any field of study. The Public Policy Group of the London School of Economics (LSE) published a handbook, “Maximizing the Impacts of Your Research,” which describes self-citation rates across different groups of disciplines, indicating that they vary by up to 40% (11).

Unfortunately, there is no significant penalty for the most frequent self-citers, and the effect of self-citation remains positive even at very high rates of self-citation (5). However, WoS has dropped some journals from its database because of anomalous citation trends (4). The same policy should also be applied to the most frequent self-citers.
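The 20% threshold discussed in the letter is straightforward to check mechanically once the citing papers' author lists are known. A minimal sketch, assuming a simple record format; the function name `self_citation_rate` and all author names are invented for illustration:

```python
def self_citation_rate(paper_authors, citing_author_lists):
    """Fraction of citations in which at least one citing author is
    also an author of the cited paper (i.e., a self-citation)."""
    if not citing_author_lists:
        return 0.0
    authors = set(paper_authors)
    self_cites = sum(
        1 for citers in citing_author_lists if authors & set(citers)
    )
    return self_cites / len(citing_author_lists)

# Hypothetical example: 2 of 10 citing papers share an author -> 20%,
# exactly at the WoS threshold mentioned in the letter.
rate = self_citation_rate(
    ["A. Smith", "B. Jones"],
    [["A. Smith", "C. Wu"], ["B. Jones"]] + [["D. Lee"]] * 8,
)
print(f"{rate:.0%}")  # 20%
```

Real bibliometric tools must additionally disambiguate author names, which this sketch ignores.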
The ethics of publications should be adhered to by those who wish to conduct research and publish their findings.
This presentation covers scholarly communication and how it works, ways to identify the right journals for publication, and briefly discusses preprints as an alternative publication space for making research more open and visible.
How to measure research impact on the web - Kinga Hosszu
This presentation explains how research impact measurement has changed with the advent of the internet, and provides examples of how impact can be measured using several online tools.
This presentation was provided by Emma Warren-Jones of Scholarcy, during the NISO event "Researcher Behaviors and the Impact of Technology," held on March 25, 2020.
Lecture given to Unit 8 (INDS 208) -- Pathobiology Treatment and Prevention of Disease -- in the undergraduate medical curriculum at McGill University on September 10, 2012.
Reputation, impact, and the role of libraries in the world of open science - Keith Webster
An overview of the relationship between open science, research assessment, university rankings, and the role of librarians in advancing the research university
Presentation at the Philippine National Health Research Week preconference meeting: Rallying Communicators for Science, Technology, and Innovation in Health | Society of Health Research Communicators (SHARE). 22 August 2017, Hotel Jen, Manila.
Lecture on "Altmetrics: An Alternative View-Point to Assess Research Impact" in Five days Advanced Training Programme on Bibliometrics and Research Output Analysis during 15th - 20th June, 2015 at INFLIBNET Centre, Gandhinagar.
Joining the ‘buzz’: the role of social media in raising research visibility ... - Eileen Shepherd
[This presentation is based on my previous presentation, of the same title, at the LIASA 2014 conference. It was presented as a webinar for LIASA Higher Education Libraries Interest Group on 6/11/2014]
Traditional bibliometric methods of evaluating academic research, such as journal impact factors and article citations, have been supplemented in the past 5-10 years by the development of altmetrics (alternative metrics or article-level metrics). Altmetrics measure the impact of research, data and publications through indicators such as references in data and knowledge bases, article views, downloads, and mentions in social and news media. This presentation gives a brief background to altmetrics and demonstrates how Rhodes University librarians are using social media to raise the visibility of their institution's research output. (Rhodes University is in Grahamstown, South Africa.)
Joining the ‘buzz’ : the role of social media in raising research visibility at Rhodes University, Grahamstown, South Africa - HELIG Webinar presented by Eileen Shepherd
Librarians & altmetrics: Tools, tips and use cases - Library_Connect
Altmetrics are becoming an integral part of looking at the impact and reach of research. Tracking social and online outlets, altmetrics provide quick feedback from a wide range of sources. In this webinar, library experts will discuss how altmetrics work, tools available, and the application of altmetrics in a range of institutions and for various user groups. Watch the webinar: http://ow.ly/vNeax
WEBINAR: Joining the "buzz": the role of social media in raising research vi... - HELIGLIASA
Joining the ‘buzz’: the role of social media in raising research visibility: Traditional bibliometric methods of evaluating academic research, such as journal impact factors and article citations, have been supplemented in the past 5-10 years by the development of altmetrics (alternative metrics/article-level metrics). Altmetrics measure aspects of the impact of a work, such as references in data and knowledge bases, article views, downloads and mentions in social media and news media.
This webinar (based on a presentation of the same name at the LIASA conference on 24th September 2014) gives a brief background to altmetrics and demonstrates how Rhodes University, Grahamstown, librarians are using social media to raise the visibility of the research output of their institution.
Presented by Eileen Shepherd, Principal Librarian, Science & Pharmacy, Rhodes University Library
This concept can be applied to the wisdom of clinicians inside healthcare institutions. By gathering and sharing course content and tools between care facilities, hospitals can be connected to more than just the technical cloud. They can be connected to the wisdom of the cloud.
Lecture to course on team science taught in the MS in Clinical Investigation program, Graduate School of Biomedical Sciences, University of Massachusetts Medical School.
Research-Open Access-Social Media: a winning combination, presented by Eileen Shepherd at the Open Access Symposium on 21 October 2014 - Rhodes University Library
Using alternative scholarly metrics to showcase the impact of your research: ... - SC CTSI at USC and CHLA
Date: Feb 7, 2018
Speaker: Caroline Muglia, Co-Associate Dean for Collections and Technical Services; and Head, Resource Sharing and Collection Assessment, USC Libraries
Overview: Scholarship is increasingly being created, disseminated, and measured on digital and social platforms. If Twitter exchanges, Facebook “saves,” and YouTube hits are the new metrics for tracking scholarship, how are we measuring societal and educational impact and outreach? How can researchers display their research impact using social media on promotion and tenure dossiers? This webinar will discuss altmetrics, alternative scholarly metrics that measure the impact and use of scholarship. We will focus on PlumX, the tool used at USC, which combines traditional and new metrics to paint a comprehensive portrait of your scholarly output and its reach in various communities and with different stakeholders.
ALTMETRICS: A HASTY PEEP INTO NEW SCHOLARLY MEASUREMENT - Saptarshi Ghosh
The term ‘Altmetrics’ was proposed by Jason Priem, a PhD student at the School of Information and Library Science at University of North Carolina, Chapel Hill through a tweet. [https://twitter.com/asnpriem/status/25844968813].
Altmetrics combines the two words ‘alternative’ and ‘metrics’: the ‘alt-’ part refers to alternative types of metrics (that is, alternatives to traditional measures such as citation analysis, the impact factor, and downloads and usage data).
Altmetrics is the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship (http://altmetrics.org/about/). It is the study of new indicators for the analysis of academic activity based on Web 2.0.
The Kaleidoscope of Impact: same data, different perspectives, constantly cha... - Kudos
Scholars, scientists, academic institutions, publishers and funders are all interested in impact. We have different roles and goals, and therefore different reasons for needing to understand impact; we are therefore asking different questions about impact, and those questions continue to evolve, much as the concept of impact itself is evolving. To answer our different questions, do we need different data, in separate silos, or are we looking at the same data, from different angles? This session gathered researcher, library, publisher and metrics provider perspectives to consider who has an interest in impact, what data they are interested in, how they use it, and how the situation is evolving as e.g. business models and technical infrastructures shift.
Research-Open Access-Social Media: A winning combination - Eileen Shepherd
This presentation endeavours to show that social media and open access are a great couple, to provide a brief introduction to altmetrics (a non-traditional form of measuring scholarly impact), and to demonstrate the use of social media in raising awareness and visibility of Rhodes University research.
Communicating for a Research Institution - Kara Gavin
Introduction to why universities and other research institutions employ science/medical communicators, and what their role is and how they can coordinate among communicators from different areas of the same institution or across institutions. Also includes slides on public understanding of science.
Similar to Altmetrics for research: impact measurement & #hcsm
Workshop given at the Medical Library Association Conference in Seattle WA, May 24th, 2012. This course is part of the Medical Library Association's Disaster Information Specialization Program.
Workshop - Disaster Health Information Sources: The Basics - Robin Featherstone
Continuing Education workshop given at the Midcontinental Medical Library Association (MLA) Chapter Meeting in St Louis Missouri on September 21, 2011. Disaster Health Information Sources: The Basics is the foundational class in MLA's Disaster Information Specialization. For more info, see: http://www.mlanet.org/education/dis/
Webinar - Disaster Health Information Sources: The Basics - Robin Featherstone
Webinar workshop given on September 14th and 15th to members of the Medical Library Association (MLA). Disaster Health Information Sources: The Basics is the foundational course in MLA's Disaster Information Specialization. For more info see: http://www.mlanet.org/education/dis/
Altmetrics for research: impact measurement & #hcsm
1. Robin Featherstone, MLIS
Research Librarian, Alberta Research Centre for Health Evidence
@rmfeatherstone
http://www.slideshare.net/featherr
Altmetrics for research: impact measurement & #hcsm
http://www.ualberta.ca/ARCHE/
@arche4evidence
Summer 2014
2. Conflict of interest disclosure
• I have had free access to the Altmetric.com Explorer since September 2013
3. Altmetrics…
[…] capture ways in which articles are disseminated throughout the expanding scholarly ecosystem, and reach beyond the scope of traditional trackers and filters.
[…] measure research impact by including references outside of traditional scholarly publishing.
1. Public Library of Science (PLOS). Altmetrics. [29 April 2014]. Available from http://article-level-metrics.plos.org/alt-metrics/
2. Baynes G. Scientometrics, bibliometrics, altmetrics: Some introductory advice for the lost and bemused. Insights. 2012;25(3):311-5. doi: 10.1629/2048-7754.25.3.311.
4. Altmetric tools measure data collected from:
• Tweets
• Blog mentions
• Facebook posts
• Presentations
• News articles
• Shared citations (e.g., Mendeley, CiteULike)
• Data uploads
• Etc.
6. What’s your impact? (3)
3. Emerald Group Publishing. Impact of Research [11 April 2014]. Available from: http://www.emeraldgrouppublishing.com/authors/impact/index.htm
http://blogs.lse.ac.uk/impactofsocialsciences/2011/07/14/publishers-measuring-impact/
7. Agenda
• Researcher & article case studies
• Altmetric products
• Measurable mentions & unique IDs
• Gaming the system
• Altmetric research
• Altmetrics for biomedical and health sciences
• Future directions
8. Researcher case study – Heather Piwowar
• h-index: 9 – 23 articles cited 280 times
http://www.scopus.com/authid/detail.url?authorId=25122628200
9. Researcher Case Study – Heather Piwowar
• 40 Slideshares, 88 followers: http://www.slideshare.net/hpiwowar
• 107 ImpactStory research products: http://impactstory.org/HeatherPiwowar
• 1469 Figshare views: http://figshare.com/authors/Heather_Piwowar/96399
• 1812 GitHub contributions: https://github.com/hpiwowar
• 52 Google Scholar publications cited 762 times: http://scholar.google.ca/citations?user=1YLq0XMAAAAJ&hl=en
• ResearchGate score: 15.31; ResearchGate Impact Points: 50.29: https://www.researchgate.net/profile/Heather_Piwowar2
10. Article Case Study – Carlisle 2014
• Web of Science times cited: 1
• Google Scholar times cited: 1
• Scopus times cited: 1
18. Q: Which Twitter post will be counted by altmetrics measurement instruments?
A. Great review on pet ownership for children with autism by Gretchen Carlisle
B. Great review by @gretchgretch15 on pet ownership - #autism
C. Great review - http://dx.doi.org/10.1016/j.pedn.2013.09.005
19. What is gaming?
• Alice has a new paper out. She asks those grad students of hers who blog to write about it.
• Alice has a new paper out. She believes that it contains important information for diabetes patients and so signs up to a ’100 retweets for $$$’ service.
5. Case examples from: Adie, E. Gaming Altmetrics. [29 April 2014]. Available from http://www.altmetric.com/blog/gaming-altmetrics/
21. Spectrum of social media self-promotion
6. Adie, E. Gaming Altmetrics. [29 April 2014]. Available from http://www.altmetric.com/blog/gaming-altmetrics/
22. Altmetrics research
• PLOS Altmetrics Collection: http://www.ploscollections.org/altmetrics
• Many studies compared altmetrics with citation counts and Impact Factors
• Positive but weak correlations found
23. Altmetrics in the health sciences
• ‘Biomedical and health sciences’ showed the highest share of publications with Altmetric.com scores (7)
• Increasing numbers of PubMed citations were being tweeted (8):
2010: 2.4%
2011: 10.9%
2012: 20.4%
7. Costas R, Zahedi Z, Wouters P. Do altmetrics correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. arXiv preprint arXiv:1401.4321. 2014.
8. Haustein S, Peters I, Sugimoto CR, Thelwall M, Larivière V. Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature. Journal of the Association for Information Science and Technology. 2013.
24. Social Media and health
• Used for:
– Health promotion (9, 10)
– Education (11)
– Communication and knowledge sharing (12)
9. Williams G, Hamm MP, Shulhan J, Vandermeer B, Hartling L. Social media interventions for diet and exercise behaviours: a systematic review and meta-analysis of randomised controlled trials. BMJ Open. 2014;4(2). doi: 10.1136/bmjopen-2013-003926.
10. Hamm MP, Chisholm A, Shulhan J, Milne A, Scott SD, Given LM, et al. Social media use among patients and caregivers: a scoping review. BMJ Open. 2013;3(5). doi: 10.1136/bmjopen-2013-002819.
11. Hamm MP, Klassen TP, Scott SD, Moher D, Hartling L. Education in Health Research Methodology: Use of a Wiki for Knowledge Translation. PLoS ONE. 2013;8(5):e64922. doi: 10.1371/journal.pone.0064922.
12. Hamm MP, Chisholm A, Shulhan J, Milne A, Scott SD, Klassen TP, et al. Social media use by health care professionals and trainees: a scoping review. Academic Medicine: Journal of the Association of American Medical Colleges. 2013;88(9):1376-83. doi: 10.1097/ACM.0b013e31829eb91c.
25. Using Altmetric.com rankings – Pediatric Journals
Rank | Score | Article Title | Year | Journal
1 | 1350 | Early Television Exposure and Subsequent Attentional Problems in Children | 2004 | Pediatrics
2 | 558 | Do Television and Electronic Games Predict Children’s Psychosocial Adjustment? Longitudinal Research Using the UK Millennium Cohort Study | 2013 | Archives of Disease in Childhood
3 | 521 | Microbial Contamination of Human Milk Purchased via the Internet | 2013 | Pediatrics
4 | 457 | Effective Messages in Vaccine Promotion: A Randomized Trial | 2014 | Pediatrics
5 | 383 | Gun Violence Trends in Movies | 2013 | Pediatrics
32. Future Directions – Research
• Is there a correlation between altmetrics rankings and…
– full-text downloads?
– qualitative measures of interest?
– …
33. Future Directions - Assessment
• Will be used by:
– funding agencies
– research centres
– Knowledge Translation (KT) researchers
35. The Alberta Research Centre for Health Evidence
Edmonton, Alberta, Canada
Robin Featherstone, MLIS
@rmfeatherstone
feathers@ualberta.ca
Slides available: http://www.slideshare.net/featherr
Editor's Notes
Because it is relevant to this presentation, I’ll point out my two Twitter handles: one for our research centre (@arche4evidence), and one for myself as a researcher (@rmfeatherstone). It is somewhat fitting that I’m including both handles on this introductory slide. I’ll be speaking today about altmetrics as both a researcher who is working to establish my own impact within a community of information scientists, and as an embedded health research librarian who is responsible for the social media presence of my employer and assessing our centre’s social media communication strategies.
The free access Altmetric.com granted me to their Explorer justifies a conflict of interest disclosure. Because of that access, the Explorer is the primary altmetric tool that I use for my work and is (likely) over-represented in the presentation today.
I first heard about altmetrics at the Medical Library Association conference in 2013, where this intriguing new cross-section of social media and article metrics came up in numerous sessions. Altmetric start-ups were pushing products that measured research in a different way, and librarians were excited to learn more.
Altmetrics measure research impact by including references outside of traditional scholarly publishing (4). These social web metrics were first proposed in 2010 as a response to scholars moving their work online.
As librarians have been explaining for years, there are limits to what IF and h-index figures can tell us. A junior faculty member may have created and shared hundreds of captivating lectures online but only published a few articles. That teaching is not reflected in their h-index. They may author a widely-followed blog in which they engage with an audience of academic peers, but there is no IF for the blog. As numbers of Twitter followers or Facebook friends quantify social media activity, altmetrics measure and rank researcher output, impact and influence from the social web.
In these definitions of altmetrics, there is an emphasis on getting “beyond” or “outside” the traditional ways of measuring research impact.
Altmetrics synthesize data collected from tweets, blogs, presentations, news articles, comments, or any social commentary about a diverse group of scholarly activities that are captured on the web.
The obvious caveat about altmetrics is that they are only valid and valuable for the most recent publications.
Social media mentions are rare for articles published prior to 2011, and altmetric products often exclude older datasets in their analysis.
The fast pace of social media -- the exponential output -- makes it difficult to collect data sets from information telecommunications networks. The numbers are enormous and part of a Big Data challenge. Academic computing gives us better data collection methods, and big storage solutions make analysis possible.
But how else do we measure information that facilitates change? Surveys. Resources permitting, we should always ask human beings. Altmetrics are more cost-effective, though, and they provide administrators and accountants with numbers they can pull from the web.
I first heard about the concept of impact zones from Lisa Given, an information scientist and expert on qualitative methodology who studies research impact. Impact zones have also been discussed on the London School of Economics Impact Blog.
The model was developed by Emerald Group Publishing and describes six zones where your research can have impact. Traditional citation metrics really only tell us about one zone: knowledge. One of the reasons that I’m excited about alternative metrics is that I believe they can help us measure impact in some of the other five zones.
Aside from simply defining and justifying alternative metrics for measuring research impact, I’ll be presenting case studies that illustrate how different altmetric products measure the impact of researchers and articles. I’ll share some tips for making sure your social media mentions are counted by altmetric tools. I’ll discuss a particular challenge for altmetric tools of dealing with instances of “gaming” (or manipulating) rankings, and the spam and pay-for-systems that inflate online mentions. I’ll provide a brief overview of altmetrics research and how I use altmetrics in health research. Finally, I’ll describe some future priorities for altmetric products and the opportunities for librarians to use and educate colleagues about altmetrics.
Heather, who is a computer scientist and one of the developers of the altmetric tool ImpactStory, was generous in giving me permission to use her researcher profiles for this case example.
Heather’s h-index, one of the recognized impact measures, is 9. That number tells us about the 23 selectively-indexed academic journal articles that she’s published and the number of times they’ve been cited (in selectively-indexed academic journals). But it’s not a complete picture of her academic output.
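The h-index described here is mechanical to compute from a list of per-paper citation counts: it is the largest h such that the author has h papers with at least h citations each. A minimal sketch; the citation counts below are invented for illustration and are not Heather Piwowar's actual data:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank cites
        else:
            break
    return h

# Hypothetical counts for 15 papers; the 9th most-cited paper has 9
# citations, so the h-index is 9.
print(h_index([33, 30, 21, 15, 12, 11, 10, 9, 9, 6, 5, 3, 2, 1, 0]))  # 9
```

Note that, as the speaker notes observe, the result depends entirely on which publications the indexing database chooses to count.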
Heather also posts her conference presentations to SlideShare and has 88 followers. She has an impressive number of “research products” that incorporate publications from both inside and outside of traditional academic journals. Her uploads to Figshare have been viewed over 1400 times, and her contributions to data repositories as measured by GitHub are equally impressive. Heather has a Google Scholar profile that showcases citations and publications that are not calculated as part of her h-index. She is part of an online academic community through ResearchGate and has been evaluated highly by her peers. These different altmetric measurements create a very different picture of Heather than her h-index does.
We can see a similar disconnect between traditional and alternative metrics for this article.
The Altmetric.com ranking for the Carlisle paper tells a different story. Since the article was published in 2014, 16 news outlets have reported on the study, and 22 tweets have mentioned it. Clearly people are discussing and sharing this article. Just a reminder that these are dynamic rankings – the scores for the Carlisle paper may be higher today than when I prepared these slides.
As you can probably tell from these two case studies of a researcher and a publication, different altmetric tools tell different stories.
Since we’re looking at some Altmetric.com data, I’ll start by talking about them. And then move to some of the other major players.
Altmetric.com scores articles with embeddable, donut-shaped badges. Subscription costs vary and the company has been generous to librarians (like myself) in offering free accounts during their start-up phase. Their application programming interface (API) is available for free to any web developer who wants to embed Altmetric.com badges on their site. The company has published extensively on the subject of altmetrics and their data sets are contributing to bibliometric/sociometric/webometric research. Altmetric.com collects and analyzes mentions on social media sites, particularly Twitter and Facebook. The Altmetric.com Explorer searches datasets by keywords and subject headings, but works best with PubMed Identifiers (PMIDs), International Standard Serial Numbers (ISSNs), or Digital Object Identifiers (DOIs).
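For anyone curious what using that free API looks like in practice, here is a minimal Python sketch. The endpoint path and the 404-for-unmentioned-articles behaviour are assumptions based on the publicly documented v1 API, so verify against Altmetric.com’s current documentation before relying on it.

```python
import json
import urllib.error
import urllib.request

# Base URL of Altmetric.com's public v1 API (an assumption based on the
# documented free tier; paths and response fields may change over time).
ALTMETRIC_API = "https://api.altmetric.com/v1"

def altmetric_url(doi: str) -> str:
    """Build the lookup URL for a DOI, e.g. '10.1371/journal.pbio.1001361'."""
    return f"{ALTMETRIC_API}/doi/{doi}"

def fetch_altmetric_counts(doi: str) -> dict:
    """Fetch attention data for a DOI; returns {} if Altmetric has no record."""
    try:
        with urllib.request.urlopen(altmetric_url(doi)) as resp:
            return json.load(resp)
    except urllib.error.HTTPError:
        # The API is assumed to return 404 for articles with no mentions.
        return {}
```

The returned JSON (for known articles) includes counts per source, which is exactly the donut-badge data the Explorer visualizes.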
There are many products, some of which existed (like F1000) before the term “altmetrics” was coined. Currently, the landscape is full of start-ups positioning themselves as investment-worthy knowledge providers. Some altmetric tools tell us about individual articles, others about researchers, and still others about institutions. Researcher-focused products, like ImpactStory and ResearchGate, resemble familiar social networking sites in that they rely on contributors creating and maintaining personal profiles.
The evolution from Facebook to LinkedIn to ImpactStory makes logical sense. User-contributed profiles became online resumes and then dynamic curricula vitae with embedded metrics for research products. For researcher-focused altmetric tools, older publications, presentations and products can be manually added. These products that tell us about researchers are more likely to include contributions prior to 2011, and for that reason are better suited than article-focused altmetric tools for analyzing research output over time.
ImpactStory.org is an open-source product that connects PMIDs, DOIs, GoogleScholar citations, ORCID identifiers (unique researcher identifiers) and SlideShare profiles to count “Research Products.” ImpactStory.org creates a free public profile for the individual researcher that includes their Wikis and blogs, and praises their Open Access publications with a medal ranking. ImpactStory helps scholars create and disseminate online resumes, in a similar way to LinkedIn.
ResearchGate.net also claims to measure impact in a new way, and ranks “scientific reputation” through their RG Score. ResearchGate hosts an open platform for researchers to share and discuss their work. Products from researcher profiles contribute to RG Scores, as do evaluations of those products by ResearchGate peers. Aggregated RG Scores are also presented for institutions based on member contributions.
I find this view of the comparative rankings of academic and scientific institutions, based on the cumulative scores of their individual members, fascinating. There are Chinese, Brazilian, American, Russian, Swedish, and French institutions in their top ten rankings. ResearchGate really gives you a sense of the international landscape for academic research impact. The results are also customized for Canadian audiences, and will also show me national and North American institutions’ rankings.
From researcher and institution-specific metrics, the altmetric product landscape includes producers of article-level metrics. One of these article-level altmetrics providers is Altmetric.com that I already described. But another worth mentioning is from the publisher PLOS.
PLOS Article-Level Metrics examines the overall “performance and reach” of articles, and is available for every article published by PLOS (Public Library of Science). PLOS Article-Level Metrics aggregate usage data (i.e., downloads), citations, ratings, social networking mentions, blogs and media mentions. Like Altmetric.com, PLOS distributes a free API to share their article metrics on third-party websites.
The development and growth of these online academic communities can be seen in the recent development of PubMed Commons. Not only does it provide a forum for discussion about research, but it provides us with a quantitative measurement: we can count the number of comments PubMed citations have received.
One of the criticisms of efforts to measure research impact is that they are overly reliant on quantitative assessments. What we’re missing are easily accessible qualitative measures. It’s easy to count someone’s articles or datasets or the comments they’ve received on a paper, but it is much more difficult to gather evidence to support the argument that research has had a positive impact on a community.
Anyone in academia knows that the environment is changing and that we’re constantly having to justify the importance of our research. In some countries, like Australia and the UK, there are formal mechanisms from government agencies to measure the impact of research funded by tax-payers. For a new generation of researchers, it will be imperative that they share their work. And they will want to make sure citations, mentions and uploads are counted.
As expert searchers of grey literature will recognize, it is a nightmare task to capture every social media mention, tweet, blog comment, SlideShare upload, etc. on a particular researcher or publication. The lack of standardization in social media communication results in questionable data accuracy by altmetric providers, and bibliographic analyses using altmetrics include lengthy discussions of the limitations of their data sources.
To achieve accurate records of scholarly output, altmetric products rely on PMIDs, DOIs and ORCID identifiers. One lesson to take away from this presentation: smart self-promoters include a PMID or DOI when they tweet or blog about their research publications. Including unique identifiers is the best way to ensure that social media output is counted by altmetric products.
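To see why identifiers matter, consider that a harvester can only match a tweet to a paper if the tweet contains something machine-resolvable. The sketch below detects DOI-like strings using a pattern adapted from Crossref’s published recommendation for modern DOIs; it is illustrative only and will not catch every legacy identifier.

```python
import re

# Pattern for modern (10.xxxx/...) DOIs, adapted from Crossref's
# recommended regex. Treat it as a sketch, not a complete validator.
DOI_PATTERN = re.compile(r"\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+")

def find_dois(text: str) -> list[str]:
    """Return all DOI-like strings found in a tweet or blog post."""
    return DOI_PATTERN.findall(text)

# A tweet with an identifier is countable; one without is invisible:
find_dois("New paper! https://doi.org/10.1371/journal.pbio.1001361 #altmetrics")
# → ['10.1371/journal.pbio.1001361']
find_dois("New paper! Link in bio #altmetrics")
# → []
```

A tool like Altmetric.com can attribute the first tweet to the right article; the second one is lost attention, no matter how widely it is shared.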
Ethically, is there a difference between Alice’s two self-promotion activities? I would say yes. The first scenario shows intent to self-promote, but is also adding value to the research community in that the graduate students have an opportunity to comment and build upon Alice’s research through their publications. The second scenario shows intent, but adds no value. Alice is flooding Twitter with empty self-cites.
Instances of “sock puppet” (or fabricated) social media profiles are an inevitable consequence of open platforms that rely on user contributions. Since altmetrics rely on social networking sites, their data are vulnerable to gaming.
Auto-spamming is another issue to consider. This is an automated Twitter “bot” set up to tweet the water level in a reservoir in South Africa. Each time it does so, it includes a link to a paper.
The mentions are unintentional (the authors of the papers may be unaware that this program is tweeting the same link to their publication over and over). It’s not quite as bad as paying a company to set up “sock puppet” accounts, but it is an example of the kind of spamming which is far more common.
Self-citing/tweeting/blogging, intentional gaming and unintentional spamming are all challenges for altmetric product developers. There are definitely some grey areas between good and bad social media promotion. Producers, like Altmetric.com, have to draw lines and determine if and how to present questionable data.
I’m very impressed by the efforts by Altmetric.com to acknowledge gaming and to develop transparent methods for dealing with it. To gain credibility, producers must gain trust by weeding out suspicious data. A comparison can be made to researchers using data quality control mechanisms and accounting for those measures in the methods and discussion sections of their papers.
An excellent collection of research using altmetrics data is maintained by PLOS. Numerous investigations highlighted in the collection have focused on the relationship between altmetrics, citation counts and IF scores. Findings from these studies suggest a positive but weak correlation between altmetrics and traditional impact measures. Owing to the variety of altmetrics rankings and the different methods of comparison, these analyses are limited. And while proponents of altmetrics compare new tools against the standard citation measures, the evidence does not support replacing traditional citation metrics with altmetrics. However, information scientists agree citations and altmetrics measure different types of impact. In a society where social media is pervasive, it means something for a scholarly article mention to receive a million likes on Facebook, independent of any eventual citation count.
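For readers wondering what that correlation analysis actually involves: studies like these typically compute a rank correlation between altmetric scores and citation counts. The sketch below implements Spearman’s rank correlation from scratch (equivalent to scipy.stats.spearmanr); the article counts at the bottom are entirely hypothetical, invented only to show the mechanics of a weakly positive relationship.

```python
def ranks(values):
    """Rank values (1 = smallest), averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    rank = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            rank[order[k]] = avg
        i = j + 1
    return rank

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Entirely hypothetical counts for six articles, chosen only to
# illustrate a "positive but weak" relationship:
altmetric_scores = [45, 3, 120, 8, 15, 2]
citation_counts = [5, 9, 2, 12, 25, 1]
```

Running `spearman(altmetric_scores, citation_counts)` on these toy numbers yields a small positive value, the same general shape the published studies report between social media attention and citations.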
Altmetrics are not limited to any particular field of study. But they are particularly relevant to the biomedical and health sciences. The article-level altmetrics data show that social media mentions in this area are high and increasing steadily.
It is outside of the scope of this presentation to describe social media uses in the health professions. I refer you to the research of my colleagues at ARCHE who have been studying social media for health promotion, health education and knowledge sharing.
Reflecting on ARCHE’s recent research on social media, it comes as no surprise that article-level altmetrics suggest high-rates of social media mentions for health-relevant scholarly publications.
I wanted to share a few of the uses of article-level altmetric data for our research centre. One example was a data collection exercise to assist with a research prioritization project. By analyzing the articles with the highest altmetrics scores within pediatric journals, our centre saw that research about television exposure to children was of particular interest during the previous year.
This kind of information provides us with topic areas for knowledge synthesis.
With the Altmetric.com Explorer, we are able to see the dissemination patterns via news media and social networks for a particular article. We can analyze the discourse around that article.
In this case, I was interested in understanding why an outlier article from 2004 continued to be mentioned on social media. I can tell that news agencies were citing and writing about this article long after the original publication date. A news story from 2011 inspired a cascade of media mentions that resulted in people Tweeting and blogging and talking about early television exposure through social media. This particular article inspired a lot of discussion around the world.
Whatever you want to call it, I believe this information suggests real impact for researchers, parents, and health providers.
Another example of how we use Altmetrics is to chart the performance of our own publications. This can give us a sense of which areas of research are of particular interest and which journals promote greater social media activity.
The score for the confidence-intervals article is higher now than when I took this screen capture, illustrating that interest in this article continues to grow.
Analysis of the altmetric data shows us high interest from Great Britain and allows us to identify communities of researchers around the world who are discussing our research. Based on profile information, we can learn what percentages of our audience for a particular topic are members of the public, scientists, practitioners, or science communicators (i.e., journalists). We can look at the individual tweets of blog articles or Facebook mentions and learn more about how research is received by these audiences.
As illustrated by the promotion of impact zones by Emerald Publishing Group, and PLOS Article-Level Metrics, publishers are among the most ardent of altmetrics adopters. Altmetrics’ application programming interfaces (APIs) are now a common feature on journals’ table-of-contents pages, as in this example from the Cochrane Library. Publishers will increasingly promote these metrics.
As far as promotion goes, altmetrics are a cost-effective way of showing a publication’s value. APIs are also an easy way of making websites dynamic and appealing.
No one blames the researcher who wants to give their publication a little boost with a tweet or a Facebook mention, but the methods for collecting and displaying evidence of researcher output ought to make such activities transparent. Just as self-citing does not qualify as unethical, neither does self-tweeting; but those self-tweets should have lesser value or be regarded in a separate context from legitimate sharing by unbiased experts. In addition, data collector spamming needs to be detected and eliminated by altmetrics providers through transparent methods. Automated programs that “game” the analysis and inflate rankings hamper producers. Anyone using altmetrics should recognize this gaming phenomenon and the potential for research rankings to be artificially inflated.
Among future challenges for altmetrics start-ups will be standardizing methods of counting the online artifacts of research output. An analogy can be made to the need for COUNTER-compliant statistics for journal usage. Without standardization in altmetrics, these tools are just comparing apples to oranges (or pears). A researcher could have over 1000 “products” in one altmetric measurement, but a low “rank” and fewer “artifacts” in another. Only when an altmetric product achieves recognition equivalent to the gold-standard IF will deans be able to review tenure dossiers with these values.
A more fruitful approach to webometrics research will combine IF and other available datasets with altmetrics to examine social media use and knowledge dissemination strategies. For example, using altmetrics with publisher data could tell us how many times an article was tweeted compared to how many times the full-text was downloaded. We can compare downloads for high and low altmetrics scores, or between and across fields for different social media sites.
Research can also help us to understand what altmetrics are measuring. If they are measuring researcher interest, then qualitative research can confirm that assertion.
Among the potential users of altmetrics for assessment will be funding agencies. Altmetrics are readily-available (i.e., cost-effective) quantitative indicators of a return on funding investments. In the case of research centres, altmetrics can help us target promotion activities or even prioritize future research. Librarians can use altmetrics to track performance of particular journals, or to help make acquisitions decisions. Alternative metrics also provide insight into the results of social media engagement strategies and deserve to be integrated into Knowledge Translation (KT) assessment.
I hope efforts to measure research output will take into consideration an expanded view of what it means to make a positive impact. I also hope that more qualitative measurement methods will complement the overly quantitative landscape of simply counting “mentions,” “products,” or “activities.”
And I hope too that you’ve enjoyed this introduction to altmetrics. Thank you for your attention, and I’m happy to answer any questions.