The proliferation of tools and technologies to measure research impact can be overwhelming.
What do we mean by “research impact”? We’re talking about the influence and visibility of publications and researchers in their field.
What are the tools we can use to help measure impact? How can we use them effectively? This presentation aims to give a high-level outline and overview of these tools.
Although we have new technology to help us gauge research impact, essentially our goals have not changed. When considering what technology will work best for you, keep your goals in mind.
Photo Credit: https://www.flickr.com/photos/24809504@N07/3423035341/, Chris Halderman via http://compfight.com Rights: https://creativecommons.org/licenses/by-nd/2.0/
In most cases, researchers are concerned with measuring and increasing their impact to obtain PROMOTION & TENURE.
They are also looking to build their research network – finding new colleagues and research partners, and growing their conference presentation opportunities; to increase their citation counts by making their research outputs more visible; and to manage their reputation – developing a “brand” or academic persona, which is increasingly important in our web-based searching world. And of course – it bears repeating – PROMOTION & TENURE. At a higher level, institutional administrators or those in research-related offices may be looking at the research output of the university as a whole, to set priorities and make funding decisions.
Photo Credit: https://www.flickr.com/photos/51035706835@N01/154077776, Matthew McVickar via http://compfight.com Rights: https://creativecommons.org/licenses/by-nc-sa/2.0/
Let’s start with one of the basic concerns: measuring the influence and ranking of a journal. It’s important to note that no single measure fits all disciplines; each discipline – even each academic department within each university – needs to identify measures & criteria that are compatible with its field. Impact Factors are a well-known and longstanding way to measure the impact of journals, but they may not be the best measure for some fields – very few journals have Impact Factors within fields like Education, Kinesiology, Literature, etc. SJR, SNIP, Google Journal Metrics, or even much narrower lists developed by a smaller group of researchers or an academic department based on their own criteria might be better. It’s important to note that each measure listed above is tied to a particular data source. For instance, a journal can only have an Impact Factor if it’s included in a Web of Science Citation Index; a journal can only have an SJR or SNIP if it’s included in Elsevier’s SCOPUS; and Google’s Journal Metrics are computed against what’s within Google Scholar. This is not an exhaustive list – there are many more, and each academic department may choose to use these factors as only part of its evaluative criteria. Some grant funding agencies may ask researchers to provide an Impact Factor or related measure for each journal they’ve published in, but increasingly this is diversifying. NOTE: Texas State does not have access to Scopus at this time. It is being presented today by representatives from Elsevier.
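To make the journal-level measures above a little more concrete, here is a minimal sketch of the widely cited two-year Impact Factor calculation: citations received this year to a journal’s articles from the previous two years, divided by the number of citable items it published in those two years. The function name and all the numbers are invented for illustration; they are not drawn from any real journal.

```python
def impact_factor(citations_this_year, citable_items_prior_two_years):
    """Simplified two-year Impact Factor: citations received this year
    to articles from the prior two years, divided by the number of
    citable items published in those two years."""
    return citations_this_year / citable_items_prior_two_years

# Hypothetical journal: 450 citations in 2014 to the 300 citable
# items it published in 2012-2013.
print(impact_factor(450, 300))  # 1.5
```

Note that the same arithmetic underlies SJR, SNIP, and Google’s metrics in spirit – a ratio of citations to output – but each weights and windows the data differently, which is why the same journal can rank differently across them.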
We clearly have a diversity of standards. It seems every year a new standard for measuring journal impact is published, yet no universal standard exists – and likely never will.
Researchers often need to know how influential their particular publications are. This is usually done by counting the number of citations. It’s important to note that no single tool can provide a complete picture of citation count. A researcher may need to look at citation counts in the Web of Science Citation Databases, Scopus, Google Scholar, plus a subject-specific database. Also, some disciplines cite more than others – articles in the natural sciences tend to have more citations than articles in the humanities – so citation counts vary across disciplines. NOTE: Texas State does not have access to Scopus at this time.
To discover how many times an article has been cited, one can use the Web of Science Citation Databases, Elsevier Scopus, Google Scholar, and core disciplinary databases within the field. Within these one can also find an alternative measure – the h-Index (the “h” comes from the surname of its creator, Jorge Hirsch). The h-Index is assigned to a researcher – it attempts to look at the entire research output of an author and assign a single value. Newer researchers will likely have lower scores. The h-Index uses a single database as its data source, so an author will have a different h-Index in Scopus than in Web of Science. Relatedly, authors can download the Publish or Perish software to get their h-Index as well as measures of their publications’ impact within the Google Scholar universe. It’s important to note that Texas State has all of the Web of Science Citation Databases, including the new Book Citation Index and Data Citation Index, which will help us understand how books and datasets are being cited. Texas State does not have Scopus at this time.
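The h-Index definition is simple enough to sketch directly: a researcher has index h if h of their papers have each been cited at least h times. A minimal illustration in Python follows; the citation counts are invented for the example.

```python
def h_index(citation_counts):
    """Largest h such that the author has h papers each cited
    at least h times."""
    h = 0
    # Rank papers from most to least cited; the h-Index is the last
    # rank at which the paper's citation count still meets its rank.
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A researcher whose papers were cited [10, 8, 5, 4, 3] times has
# h = 4: four papers have at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

This also makes the database-dependence point visible: feed the same function the citation counts reported by Scopus versus Web of Science and you will usually get two different h values, because each database indexes a different set of citing sources.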
In our online social network world, there are many new ways a publication can influence a community, gain visibility, or attract attention to a researcher or a research project, and, in turn, bring attention to a university or university program. The traditional tools we have don’t capture this new aspect of academia, even though this aspect increasingly helps researchers achieve their goals of promotion, tenure, and visibility. Over the past several years, companies have developed new technologies to help capture a fuller picture of research impact & visibility by incorporating metrics such as social sharing, online mentions, citations on websites such as Wikipedia, and downloads or views on the web. Each of these tools works slightly differently, but the aim is the same – to help researchers and administrators get a fuller picture of the effect, influence, and visibility of research output, in a new, popular academic arena – the open Web. Texas State does not have these tools at this time. They are being presented today.
University administrators and those within research-related offices now have a variety of tools to help them evaluate and analyze the research output of their university’s community of researchers as a whole. Elsevier’s SciVal uses SCOPUS and other Elsevier resources as its data source, and Thomson Reuters’ InCites uses Web of Science and other Thomson Reuters resources as its data source. Texas State does not have access to these tools at this time.
An author might publish under a variety of names during their career – including simple variance due to different standards – Smith, Thomas vs Smith, T.J. vs Smith, Thomas J. Furthermore, many researchers share the same first and last names. Consequently, the major citation resources developed author disambiguation tools to help unify all the work of one researcher under one profile AND distinguish authors with the same name from each other. Within the Thomson Reuters universe, this is called ResearcherID. Within the Elsevier universe, it is called the Scopus Author Identifier. Over the past several years, an open source identifier has emerged – the ORCID. This is increasingly being asked for by grant agencies. There are now tools to link your ORCID to your ResearcherID and your Scopus Author Identifier. There is increasing advocacy within institutions and the research community for researchers to obtain their identifiers as soon as possible. (ORCID = Open Researcher and Contributor ID)
An increasingly important aspect of academic life is maintaining an academic profile on a networking site, similar to Facebook or LinkedIn. Academic social networking sites are effective at helping researchers connect and identify potential collaborations, promote their research and make it easier to find via Google searching, increase citation counts, and develop an academic persona. While the idea of persona, reputation, or personal brand management can be somewhat off-putting, these new ways of connecting and promoting one’s research are helpful towards reaching traditional goals like promotion & tenure. These sites are free but there is an institutional version of Mendeley that Texas State does not have access to at this time.
Many of the academic networking sites give users an option to upload full PDFs of their articles. Please note that you should be cautious when doing so. When you published an article, you no doubt signed an author agreement that gave copyright ownership to the publisher, so you are likely restricted in what you can make available openly on the web. Publishers are increasingly issuing takedown notices to authors who upload the full PDFs of their articles on sites like ResearchGate and Academia.Edu. If you still have your author agreement, you can read it to see what you are able to share. You can also search the Sherpa/Romeo website to see what the rights are for the particular journal you published in. There may be a way to upload a pre-publication version of your article. You can also upload the articles you have the rights to share openly to Texas State’s Digital Collections Repository, which is indexed by Google.
Librarians at Texas State are knowledgeable about tools & technologies related to research impact. Contact your subject librarian with any questions. Also, check out the guides we’ve created to explain and link to more information about these topics.
Research IMPACT: Tools & Technologies
Interim Head Acquisitions Librarian
Texas State University
14 November 2014
• Impact Factor (Thomson Reuters JCR/Web of Science)
• Eigenfactor (public, but based on Thomson Reuters JCR)
• SCImago Journal Rank Indicator (SJR) (Elsevier Scopus)
• Source Normalized Impact per Paper (SNIP) (Scopus)
• Google Journal Metrics (Google Scholar)
• European Reference Indicator for the Humanities (ERIH)
Journal Influence & Ranking
NO MEASURE FITS ALL DISCIPLINES:
IDENTIFY MEASURES & CRITERIA THAT
ARE COMPATIBLE WITH YOUR FIELD
NO SINGLE TOOL PROVIDES A COMPLETE CITATION COUNT
CITATION COUNTS VARY ACROSS DISCIPLINES
Author & Publication Influence & Impact
Citations of a book in Bar-Ilan, J. (2010) Citations to the “Introduction to informetrics”
indexed by WOS, Scopus and Google Scholar. Scientometrics, 82(3), p.504
• Times Cited / Citation Counts
• Web of Science Citation Databases
• Google Scholar
• Core Disciplinary Databases (IEEE, MathSciNet, PsycINFO)
• Database-relative (h-Index in Scopus vs Web of Science)
• Publish or Perish
• Software for analyzing impact within Google Scholar
Author & Publication Influence & Impact
Citation Analysis & Journal Impact Factor
Copyright (Author’s Rights and more…)
Faculty Toolkit: Altmetrics & Academic Social Media
Faculty Toolkit: Measuring Scholarly Impact
For Further Information