Whether preparing a portfolio or reassessing a research topic, metrics help quantify scholarly impact. Traditional metrics such as the h-index or Impact Factor assist in this endeavor but often fall short in capturing all sides of the story. This informational session, tailored to faculty but open to all, focuses on Alternative Metrics: what they are, where they can be accessed, and how they can be used to demonstrate impact.
Beyond Citations - NEIU NETT Day Presentation on Altmetrics
1. Beyond Citations
Demonstrating your Impact through
Alternative Metrics
October 14, 2014
Kelly Grossmann, Science Librarian
MS Information, MS Bioinformatics
2. Outline
1. How and why do we assess impact?
2. Methods and Metrics for Impact Assessment
- Traditional
- Alternative
3. Improving your Impact
3. How do we assess the impact of research?
… and why should we?
10. For Faculty & Researchers
“Publish or Perish”
- Showing academic value in a sea of information
- Increased importance of the relevance and significance of research
11. For Students
Evaluate sources and identify core papers
Follow key research trends
Perhaps even demonstrate your scholarly impact for graduate school
17. Impact Factor
Journal level metric
Average number of citations per article published in the journal over the previous two years
Easily Googled
Allows comparison by subject in Journal Citation Reports
Proprietary
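The Impact Factor listed above is just an average; a minimal sketch of the two-year formula, using hypothetical citation and article counts:

```python
def impact_factor(citations_this_year, articles_prev_two_years):
    """Two-year Impact Factor: citations received this year to items
    published in the journal over the previous two years, divided by
    the number of citable articles published in those two years."""
    return citations_this_year / articles_prev_two_years

# Hypothetical journal: 1,200 citations to 400 recent articles.
print(impact_factor(1200, 400))  # 3.0
```

The real JCR calculation also involves editorial decisions about which items count as "citable," which is one reason the metric is criticized as opaque.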
23. Google to Find h-index
http://scholar.google.com
Use author profile to find author h-index
Use Journal metrics to find journal h-index (average)
Allows you to compare impact within your field
25. Criticism of Traditional Methods
Depend on counters
"Gaming the system"
Slow
Vary by time and length of career
Gender bias in refereeing practices
Skew toward favoring popular research
Do not distinguish negative from positive citations
Vary greatly by discipline
Often only available at the journal level (IF & EF)
28. Altmetrics
Short for Alternative Metrics
Methods of analyzing impact beyond citations
Made possible by our ability to quickly, electronically share information.
42. Where to Share
GitHub - Code
FigShare - Datasets, images, videos
SlideShare - Presentations
Twitter - Social networking
Facebook - Social networking
43. In summation...
Many metrics, many tools.
Combination of metrics is best.
Be sure to compare within your discipline.
Be sure to share your work to improve your impact.
45. References
[1] Sci2 Team. Science of Science (Sci2) Tool. User Manual. http://wiki.cns.iu.edu/pages/viewpage.action?pageId=2200066
[5] Priem, J. "MEDLINE-indexed articles published per year." R Chart. Jason Priem/blog 18 Oct. 2010. Accessed 10 Oct. 2014 <http://jasonpriem.org/2010/10/medline-literature-growth-chart/>.
[6] Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C,et al. (2011) “The Development of Open Access Journal Publishing from 1993 to 2009.” Figure 2: The development of open access publishing.
PLoS ONE 6(6): e20961. doi:10.1371/journal.pone.0020961.
[7] Van Noorden, R. “Science publishing: The trouble with retractions.” Box: Rise of the retractions. Nature. Published online 5 Oct. 2011. Accessed 10 Oct. 2014. doi: 10.1038/478026a
<http://www.nature.com/news/2011/111005/full/478026a/box/2.html>
[7] Bohannon, J. “Who’s Afraid of Peer Review?” Science. 4 Oct. 2013. 342(6154). pages 60-65. doi: 10.1126/science.342.6154.60 <http://www.sciencemag.org/content/342/6154/60.full>
[8] McGovern, V. “Foundation funding and chemical biology.” Trends in research funding by agency. Nature Chemical Biology. 4, 519-522. 2008. doi: 10.1038/nchembio0908-519.
<http://www.nature.com/nchembio/journal/v4/n9/fig_tab/nchembio0908-519_F1.html>
[12, 13] Sci2 Team. Science of Science (Sci2) Tool. Indiana University and SciTech Strategies, (2009). https://sci2.cns.iu.edu.
[19] University of Washington. “Overview”. Eigenfactor information page. <http://www.eigenfactor.org/methods.php> Image: A model of research. <http://www.eigenfactor.org/images/animatedfigure.gif>
Accessed on 10 Oct. 2014.
[20] Vulpecula (User name). "h-index" h-index from a plot of decreasing citations for numbered papers. Wikipedia. 02 Jan. 2008. Accessed 10 Oct. 2014 <http://en.wikipedia.org/wiki/H-index>.
[25] Wendl, M.C. “H-index: however ranked, citations need context.” Nature. (2007) 449: 403.
[25] Kelly, C. D. , Jennions, M.D. “H-index: age and sex make it unreliable.” Nature. (2007) 449: 403.
[25] Pagel, P.S., Hudetz, J.A. "H-index is a sensitive indicator of academic activity in highly productive anaesthesiologists: results of a bibliometric analysis." Acta Anaesthesiologica Scandinavica. (2011). 55:9. 1085-1089. doi: 10.1111/j.1399-6576.2011.02508.x <http://onlinelibrary.wiley.com/doi/10.1111/j.1399-6576.2011.02508.x/abstract>
[26] West Side Story. Dir. Jerome Robbins and Robert Wise. Perf. Natalie Wood. Comp. Leonard Bernstein. Mirisch Corporation, 1961. Film.
[29] Careless, J. (2013). Altmetrics 101: A Primer. (cover story). Information Today, 30(2), 1-36. [Accessed October, 2014].
[32] Lowman, M. (2014). How to Raise a Woman Scientist. Retrieved 13 Oct. 2014, from http://www.huffingtonpost.com/meg-lowman/how-to-raise-a-woman-scie_b_5928644.html
[34, 35] Mendeley. <http://www.mendeley.com/catalog/modularity-community-structure-networks-27/> Accessed 10 Oct. 2014
[37] Thomson, J. "Altmetrics added to Royal Society of Chemistry Journals". RSC Publishing Blog. 12 Sept. 2013. Accessed on 10 Oct. 2014. http://blogs.rsc.org/rscpublishing/2013/09/12/altmetrics-added-to-royal-society-of-chemistry-journals/
46. Further Reading
Altmetrics Manifesto http://altmetrics.org/
University of Maryland Altmetrics LibGuide
http://lib.guides.umd.edu/altmetrics
HLWiki International:
http://hlwiki.slais.ubc.ca/index.php/Author_impact_metrics
Full citations available in presentation notes.
The number of articles indexed in MEDLINE (the database behind PubMed) has grown almost exponentially since 1950, with the trend steepening in the late 1990s.
Source:
Priem, J. "MEDLINE-indexed articles published per year." R Chart. Jason Priem/blog 18 Oct. 2010. Accessed 10 Oct. 2014 <http://jasonpriem.org/2010/10/medline-literature-growth-chart/>.
The web has allowed for the growth of open access journals, and thus an increase in open access articles, via a more rapid cycle of production and review. This graph, developed by a group of economics researchers and published in an open access journal, comes from an analysis of the rate of article publication and the increasing prevalence of open access journals. Here we see that the number of articles published has grown greatly along with the number of open access journals.
Source:
Laakso M, Welling P, Bukvova H, Nyman L, Björk B-C,et al. (2011) “The Development of Open Access Journal Publishing from 1993 to 2009.” Figure 2: The development of open access publishing. PLoS ONE 6(6): e20961. doi:10.1371/journal.pone.0020961.
With more articles come more retractions. Retractions have actually grown faster than the rate of article growth.
I am not attributing this trend to open access publication practices (although read John Bohannon's "Who's Afraid of Peer Review?" to learn more about open access journals with unacceptable publication practices); the increase in retractions more likely has to do with better research oversight committees (Van Noorden) and the culture of academic publication (publish-or-perish pressure).
Source:
Van Noorden, R. “Science publishing: The trouble with retractions.” Box: Rise of the retractions. Nature. Published online 5 Oct. 2011. Accessed 10 Oct. 2014. doi: 10.1038/478026a <http://www.nature.com/news/2011/111005/full/478026a/box/2.html>
Bohannon, J. “Who’s Afraid of Peer Review?” Science. 4 Oct. 2013. 342(6154). pages 60-65. doi: 10.1126/science.342.6154.60 <http://www.sciencemag.org/content/342/6154/60.full>
After inflation is accounted for, we see a downward trend in public funding for the sciences since 2004.
McGovern, V. “Foundation funding and chemical biology.” Trends in research funding by agency. Nature Chemical Biology. 4, 519-522. 2008. doi: 10.1038/nchembio0908-519. <http://www.nature.com/nchembio/journal/v4/n9/fig_tab/nchembio0908-519_F1.html>
The study of scholarly communication has become a science in and of itself. Sci2 is an example of a data analysis tool that can be used to examine networks of scholarly communication.
Sci2 Team. Science of Science (Sci2) Tool. Indiana University and SciTech Strategies, (2009). https://sci2.cns.iu.edu.
The following tools are a handful of the traditional metrics used to analyze scholarly impact.
The most traditional way of assessing impact: once an article is published, the more that additional articles refer to it, the higher its citation count.
Article level
Note the differences in citations.
These are limited to the journals indexed in Web of Science and can only be located for articles listed in Biological Abstracts (due to our access restrictions). More articles can be found in Web of Science, but you must access them as a visitor of the UIC Library.
Differences in citation counts can also be attributed to inaccurate indexing in Google Scholar, which double-counts some citations.
Google Scholar also counts citations from questionable 'scholarly' journals.
Cons
Proprietary, algorithm is not public
Available only at the Journal level
Available only for journals in the ISI lists.
Should be compared to other journals in your field for a more accurate representation of impact.
Can be gamed by self citations.
An example of a very high impact factor.
Pros:
A more robust algorithm than IF, accounts for the impact of citations from influential journals (determined by the citations of the citing journals).
Scores journals from 0.01 to 100, with all Eigenfactor scores summing to 100.
Really beautiful data visualizations.
Not proprietary.
Easily found via Google or the Eigenfactor website.
Makes it easy to compare to other Eigenfactors in your field.
Cons:
Only available for ISI Journals.
Only on the journal level.
Source:
University of Washington. “Overview”. Eigenfactor information page. <http://www.eigenfactor.org/methods.php> Image: A model of research. <http://www.eigenfactor.org/images/animatedfigure.gif> Accessed on 10 Oct. 2014.
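The "more robust algorithm" described above weights each citation by the influence of the citing journal, in the spirit of eigenvector centrality. A toy power-iteration sketch follows; the three-journal citation matrix is hypothetical, and real Eigenfactor scores are computed over the full JCR citation network with self-citations excluded:

```python
def eigenfactor_sketch(cite_matrix, iterations=100):
    """Toy eigenvector-centrality scores for a journal citation matrix.
    cite_matrix[i][j] = citations from journal j to journal i
    (diagonal already zeroed, mimicking Eigenfactor's exclusion of
    self-citations). Returns scores scaled to sum to 100, following
    the Eigenfactor convention."""
    n = len(cite_matrix)
    # Column-normalize: each citing journal's outgoing citations sum to 1.
    col_sums = [sum(cite_matrix[i][j] for i in range(n)) for j in range(n)]
    p = [[cite_matrix[i][j] / col_sums[j] if col_sums[j] else 1 / n
          for j in range(n)] for i in range(n)]
    # Power iteration toward the stationary (leading eigen-) vector.
    v = [1 / n] * n
    for _ in range(iterations):
        v = [sum(p[i][j] * v[j] for j in range(n)) for i in range(n)]
    total = sum(v)
    return [100 * x / total for x in v]

# Three hypothetical journals; journal 0 receives the most citations.
m = [[0, 5, 4],
     [1, 0, 2],
     [1, 1, 0]]
scores = eigenfactor_sketch(m)
print(scores)  # scores sum to 100; journal 0 scores highest
```

A citation from a heavily cited journal raises the target's score more than one from a rarely cited journal, which is the key difference from the plain Impact Factor average.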
Also known as the Hirsch index, after physicist Jorge E. Hirsch.
Traditionally used in science.
Pros:
Author level
Can be calculated for any author in any discipline (not limited to the ISI list).
Cons:
Varies greatly by discipline
Image Source:
User name: Vulpecula. "h-index" h-index from a plot of decreasing citations for numbered papers. Wikipedia. 02 Jan. 2008. Accessed 10 Oct. 2014 <http://en.wikipedia.org/wiki/H-index>.
h-index: 4 articles each cited at least 4 times, so h-index = 4
i10-index: 2 articles with at least 10 citations, so i10 = 2
[Thank you to Lisa Wallis for allowing us to view her publications]
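Both indexes above are easy to compute from a list of per-paper citation counts. A minimal sketch, using hypothetical counts chosen to reproduce the h = 4 and i10 = 2 example:

```python
def h_index(citations):
    """h-index: largest h such that h papers each have >= h citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """i10-index (Google Scholar): papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical author: 4 papers cited at least 4 times,
# 2 papers cited at least 10 times.
papers = [25, 12, 8, 4, 2]
print(h_index(papers))    # 4
print(i10_index(papers))  # 2
```

Note that both measures ignore the long tail of lightly cited papers, which is why they vary so much by discipline and career length.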
Can be used by anyone, including students, to evaluate the impact of a journal.
Can also be used to compare the average h-indexes of journals.
Step 1. Visit scholar.google.com
Step 2. Click “Metrics”
Step 3. Adjust by field using the links in the left column of the page.
You may view the profiles of some authors (if they are set up) by clicking on the author’s name in the Google Scholar citation.
If you are an author, you can set up your Google Scholar profile to help collect metrics.
Wendl, M.C. “H-index: however ranked, citations need context.” Nature. (2007) 449: 403.
Kelly, C. D. , Jennions, M.D. “H-index: age and sex make it unreliable.” Nature. (2007) 449: 403.
Pagel, P.S., Hudetz, J.A. “H-index is a sensitive indicator of academic activity in highly productive anaesthesiologists: results of a bibliometric analysis.” Acta Anaesthesiologica Scandinavica. (2011). 55:9. 1085-1089. dio: 10.111/j.1399-6576.2011.02508.x <http://onlinelibrary.wiley.com/doi/10.1111/j.1399-6576.2011.02508.x/abstract>
Image from: West Side Story. Dir. Jerome Robbins and Robert Wise. Perf. Natalie Wood. Comp. Leonard Bernstein. Mirisch Corporation, 1961. Film.
Traditional, article-level metrics failed to capture the vast array of additional, non-traditional ways of analyzing an individual's impact.
Fittingly enough, the term was first coined in a tweet. - J. Priem https://twitter.com/jasonpriem/status/25844968813
Careless, J. (2013). Altmetrics 101: A Primer. (cover story). Information Today, 30(2), 1-36. [Accessed October, 2014].
Screen cap from: http://www.huffingtonpost.com/jeremy-scheinberg/fostering-an-early-love-o_b_5941804.html?utm_hp_ref=girls-in-stem
An example of a non-scholarly publication that could still have a public or scholarly impact.
Lowman, M. (2014). How to Raise a Woman Scientist. Retrieved 13 Oct. 2014, from http://www.huffingtonpost.com/meg-lowman/how-to-raise-a-woman-scie_b_5928644.html
One of the best-known and most traditional altmetrics sources. Mendeley can report the number of readers of an article and show the article's popularity across various fields.
Criticism: should the count go down when an item is removed from a reader's library?
From Mendeley. <http://www.mendeley.com/catalog/modularity-community-structure-networks-27/> Accessed 10 Oct. 2014
From Mendeley. http://www.mendeley.com/research-papers/ Accessed 10 Oct. 2014.
Very handy tool for collecting altmetrics on articles.
Visit the URL. Drag the bookmarklet to your bookmarks bar.
Open an article online.
Click the Altmetric Bookmark.
A popup should appear with Altmetrics for the article.
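Beyond the bookmarklet, Altmetric also exposes a public API for programmatic lookups. The sketch below builds a v1 DOI lookup URL and parses a JSON response; the endpoint path and field names reflect Altmetric's public documentation as I understand it, but treat them as assumptions and verify against the current docs:

```python
import json

def altmetric_url(doi):
    """Build the (assumed) Altmetric v1 DOI lookup URL."""
    return "https://api.altmetric.com/v1/doi/" + doi

def summarize(response_text):
    """Pull a few commonly reported fields out of an Altmetric
    JSON response; missing fields come back as None."""
    data = json.loads(response_text)
    return {
        "title": data.get("title"),
        "score": data.get("score"),
        "readers": data.get("readers_count"),
    }

print(altmetric_url("10.1371/journal.pone.0020961"))
# To fetch live data, pass the URL to urllib.request.urlopen()
# and feed the response body to summarize().
```

This is the same data the bookmarklet popup displays, just in a form you can aggregate across your own publication list.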
Publishers are now making altmetrics available.
Thomson, J. “Altmetrics added to Royal Society of Chemistry Journals”. RSC Publishing Blog. 12 Sept. 2013. Accessed on 10 Oct. 2014. http://blogs.rsc.org/rscpublishing/2013/09/12/altmetrics-added-to-royal-society-of-chemistry-journals/
Some are even encouraging authors to promote their work using social media.
http://journalauthors.tandf.co.uk/pdfs/socialmedia-infographic.pdf
For the authors, there are a number of data aggregation tools to help you with your impact assessment.
We are no longer limited to one type of publication (scholarly articles) and one type of data (citation counts)
Enhance impact by increasing readership. Publicize your research!