Defining new metrics for library success


  • mindy
  • Alternate subtitle – from the trenches
  • - Three sections – time for relevant discussion/questions after each one. We are NOT here to explain metrics in detail, but rather to discuss conceptually what our role is and how we can help our faculty and grad students.
  • mindy
  • - Not easily measured, and there is debate on the effectiveness of the measures currently used.
  • The basic premise of most citation metrics is that the more an article is cited, the more impact it has. Counts of citations are then used to calculate a range of metrics that are believed to indicate the research impact of journals, journal articles, and the authors of those articles. Impact = reach and significance (based on assessment criteria from the UK Research Excellence Framework 2014). This is quantitative; what about qualitative impact, particularly for the social sciences? DISCUSSION POINT
  • - Although this is the logical order, I will be talking about article-level metrics last, as they are the most discussed at this juncture and the most in flux.
  • Also immediacy index and total cites:
    - The journal Impact Factor is the average number of times articles from the journal published in the past two (or five) years have been cited in the JCR year.
    - The cited half-life is the median age of the articles that were cited in the JCR year; half of a journal's cited articles were published more recently than the cited half-life. Only journals cited 100 or more times in the JCR year have a cited half-life.
    - The Eigenfactor Score calculation is based on the number of times articles from the journal published in the past five years have been cited in the JCR year, but it also considers which journals contributed those citations, so that highly cited journals influence the network more than lesser-cited journals.
    - The Article Influence Score determines the average influence of a journal's articles over the first five years after publication. It is calculated by dividing the journal's Eigenfactor Score by the number of articles in the journal, normalized as a fraction of all articles in all publications (1 = average).
    - The Immediacy Index is the average number of times an article is cited in the year it is published.
    - Total cites is the total number of citations to the journal in the JCR year.
    - SJR is weighted by the prestige of the citing journal: subject field, quality, and reputation have a direct effect on the value of a citation. SJR assigns relative scores to all of the sources in a citation network. Its methodology is inspired by the Google PageRank algorithm, in that not all citations are equal: a source transfers its own "prestige", or status, to another source through the act of citing it, so a citation from a source with a relatively high SJR is worth more than a citation from a source with a lower SJR.
    - SNIP is the ratio of a source's average citation count per paper to the citation potential of its subject field. The citation potential of a source's subject field is the average number of references per document citing that source; it represents the likelihood of being cited for documents in a particular field. A source in a field with a high citation potential tends to have a high impact per paper.
    - The h5-index is the h-index for articles published in the last five complete years.
    - The h5-median for a publication is the median number of citations for the articles that make up its h5-index.
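As an aside for the definitions above, the h5-index and h5-median can be computed directly from per-article citation counts. A minimal sketch in Python, using invented numbers rather than real journal data:

```python
# h5-index: the h-index over articles from the last 5 complete years.
# h5-median: the median citation count of the articles in that h5 core.
# Citation counts below are invented for illustration.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
    return h

def h5_median(citations):
    """Median citations of the papers that make up the h5-index."""
    core = sorted(citations, reverse=True)[:h_index(citations)]
    mid = len(core) // 2
    return core[mid] if len(core) % 2 else (core[mid - 1] + core[mid]) / 2

# Citations to a journal's articles from the last 5 complete years:
cites = [12, 9, 7, 5, 5, 3, 2, 1, 0]
print(h_index(cites))    # h5-index -> 5
print(h5_median(cites))  # h5-median -> 7
```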
  • Rita and Gail's slides
  • “Impact factors declared unfit for duty”; “Do not resuscitate: the journal impact factor declared dead”; “Just say no to impact factors”
  • Note: To use citation counts, a set timeframe (usually 1 year) must be included. Based on data within each database, so calculations of the same metric will differ, as the papers and their citation counts will differ based on content. To expand: in our experience, for faculty whose subject area does not lend itself to these providers, e.g. linguistics, we have gone to the web to find other sources and helped them calculate their own h-index. The i10-index indicates the number of academic papers an author has written that have at least ten citations from others.
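The i10-index mentioned above is the simplest of these metrics to compute by hand, which makes it a good candidate when helping someone calculate their own numbers from web sources. A small sketch with invented citation counts:

```python
# i10-index: the number of an author's papers with at least 10 citations.
# The citation counts here are invented for illustration.

def i10_index(citations):
    return sum(1 for c in citations if c >= 10)

paper_cites = [42, 15, 10, 9, 3, 0]
print(i10_index(paper_cites))  # -> 3 (the papers with 42, 15, and 10 cites)
```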
  • Reference to grad students. Highlight the limits of the h-index (plus it only uses data from each database, so each provider may calculate a different value).
  • - Teaser for next segment – other possibilities…
  • pam
  • Priem, J & Hemminger, B. H. (2010) Scientometrics 2.0: New metrics of scholarly impact on the social Web. First Monday 15 (7). Retrieved from
  • “on the slow side”: Open access on the conference circuit, posted on July 6, 2013 by Stephen Curry, Reciprocal Space. Research Council Action Plan; Europe Statement; Commons Business, Innovation and Skills Committee, Fifth Report. Open Access and funder mandates are driving OA, e.g. the Finch report and funding now available from the RCUK; FASTR bill introduced to Congress Feb. 14. Picking up speed: 1,200 journals added to DOAJ in 2012; 8,566 journals (Jan. 2013); new open monograph publishing options. All disciplines: MLA endorsement of OA, not just the sciences. Different versions: pre, post, final. Multiple copies. New modes of publishing, providing new ways to distribute research: new tools for authors for self-publishing, use of Creative Commons licences, a new look at metrics.
  • ORCID (Open Researcher and Contributor ID), a subset of the International Standard Name Identifier: http://orcid.org. Google Scholar Claim Project (JISC).
  • Enis, Matt. “As University of Pittsburgh Wraps Up Altmetrics Pilot, Plum Analytics Announces Launch of PlumX.” Library Journal, February 5, 2013. Others: PeerEvaluation, ResearchScorecard, ReaderMeter, Scholarometer.
  • A plugin for WordPress provides ALMs for one or more articles.
  • From: Lin & Fenner. Altmetrics in evolution: defining and redefining the ontology of article-level metrics. Information Standards Quarterly (2013) 25(2), 25.
  • “Single aggregated score for all metrics” (Lin & Fenner 2013, p. 26). Altmetric data from
  • Author or Group profiles with details and summaries of metrics; “Artifact” or item view with altmetric counts; Author or Group visualizations weighted by impact; Author, Group and Artifact-level widgets; tracks second-level metrics, e.g. tweets of a news article about X.
  • Slide courtesy of Mike Buschman, Plum Analytics
  • Beall
  • Proposal to study, propose, and develop community-based standards or recommended practices in the field of alternative metrics. Todd Carpenter (NISO) and Nettie Lagace (NISO), March 19, 2013. Sud & Thelwall: “to identify contexts in which it is reasonable to use them” (p. 11). See also the EU-funded ACUMEN project (Academic Careers Understood through Measurement and Norms).
  • mindy
  • - Teaching grads, and comments from grads about faculty thoughts on metrics; consider each a faculty workshop; talked to department heads; pressures in other countries; altmetrics catch our attention.
  • pam

    1. Mindy Thuna, Science Liaison; Pam King, Scholarly Communication Liaison
    2. This work is licensed under: You are free to: Share — copy and redistribute the material in any medium or format; Adapt — remix, transform, and build upon the material. Under the following terms: Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. NonCommercial — You may not use the material for commercial purposes. ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
    3. Altmetrics in Higher Ed Settings: metrics in context - what our researchers are saying
    4. Agenda • Impact Metrics • Altmetrics • What our researchers have to say • Bringing it back to libraries
    6. Impact Metrics are… calculations/algorithms that quantify the impact of research or scholarly activity.
    7. Research impact defined: “Impact is usually demonstrated by pointing to a record of the active consultation, consideration, citation, discussion, referencing or use of a piece of research.”
    8. Basic Premise: Number of citations → Impact (reach and significance)
    9. Three main providers • Scopus • Web of Science • Google Scholar
    10. Three Levels • Journal Level • Article Level • Author Level
    11. Journal Level Metrics
        - Web of Science: Impact Factor (2 & 5 year), Cited half-life, Eigenfactor® Score, Article Influence® Score
        - Scopus: SCImago Journal Rank (SJR), Source Normalized Impact per Paper (SNIP)
        - Google Scholar: h5-index, h5-median
    12. Journal Impact Factor: the average number of times articles published in a two-year (or five-year) period have been cited.
        A = total number of times ALL articles published in the 2 (or 5) year period were cited in WoS-indexed journals during the next year
        B = total number of “citable items” (usually articles, reviews, proceedings or notes; not editorials and letters-to-the-editor) published in the 2 (or 5) year period
        Impact factor = A/B
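The A/B formula on this slide can be shown as a worked example; the numbers below are invented for a hypothetical journal:

```python
# Journal Impact Factor = A / B, per the slide's definition.
# A: citations in the JCR year to items the journal published in the
#    previous 2 (or 5) years; B: "citable items" published in that window.
# Both figures below are invented.

citations_received = 450  # A
citable_items = 180       # B

impact_factor = citations_received / citable_items
print(round(impact_factor, 2))  # -> 2.5
```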
    13. Journal Impact Factor Debate
        • Vary across disciplines, e.g. variants in time-to-publication
        • Review journals have higher impact: fewer articles per journal, cited more
        • Calculations easily skewed: heavy editorial/letter/opinion content; high/low percentage of review articles
    14. Declared Dissent – 10,419
        • “Impact factors declared unfit for duty” (Stephen Curry, LSE Blog)
        • “Do not resuscitate: the journal impact factor declared dead” (Brendan Crabb, The Conversation)
        • “Just say no to impact factors” (Ismael Rafols and James Wilsdon, The Guardian)
        • DORA: San Francisco Declaration on Research Assessment (Dec. 2012)
    15. Author Level Metrics • Web of Science: h-index • Scopus: h-index • Google Scholar: h-index, i10-index
    16. The h-index is NOT perfect
                   Paper 1    Paper 2    Paper 3    Paper 4   Paper 5
        Author A   3 cites    6 cites    100 cites  4 cites   1 cite
        Author B   3 cites    6 cites    100 cites
        Author C   400 cites  150 cites  3 cites    6 cites
        h-index = 3 for each author
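The slide's point can be verified in code: three very different publication records all collapse to the same h-index. A sketch mirroring the counts above:

```python
# Each author below has a different citation profile, but all three
# end up with h-index = 3, which is the slide's criticism of the metric.

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    return max((rank for rank, c in enumerate(counts, 1) if c >= rank),
               default=0)

authors = {
    "Author A": [3, 6, 100, 4, 1],
    "Author B": [3, 6, 100],
    "Author C": [400, 150, 3, 6],
}
for name, cites in authors.items():
    print(name, h_index(cites))  # every author prints 3
```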
    17. Article Level Metrics: Citation Counts - how often a particular journal article is cited by other articles
    19. Web metrics • Webometrics: link analysis, web log file analysis (1990s) • Bibliometric use: downloads, citations • Google web page ranking • Web 2.0 → scientometrics 2.0 (Priem & Hemminger 2010)
    20. Open Access • Producing a highly complex information environment; touches all disciplines • Many formats, e.g. websites, nanopubs, blog posts, code, images • Versions, e.g. preprints, postprints, “same article, different places”, version of record
    21. Identification Systems
        Authors • International Standard Name Identifier (ISNI) • ORCID (subset of ISNI) • Names Project (JISC) • AuthorClaim, RePEc Author Service • arXiv Author ID • ResearcherID (Thomson Reuters) • Scopus Author ID (Elsevier) • Gravatars (Automattic)
        Objects • DOIs, PMIDs, URLs, URIs
    22. Altmetrics … are calculated from newer data sources, e.g. Wikipedia, Mendeley, Twitter, Facebook, Weibo, blogs … report the impact of a wider range of research outputs, e.g. presentation slides, data sets, articles, code
    23. Altmetrics aggregators/tools/services • PLoS One Article Level Metrics (ALM) • ImpactStory • Altmetric • PlumX • and others …
    24. PLoS Article Level Metrics (ALMs)
    25. ImpactStory
    26. [screenshot]
    27. Altmetric
    28. PlumX
    29. The Debate: “Article-Level Metrics: An Ill-Conceived and Meretricious Idea” vs “Broaden your horizons: impact doesn’t need to be all about academic citations” (Jeffrey Beall vs Euan Adie, Aug. 2013)
    30. Moving forward: “to move out of its current pilot and proof-of-concept phase” • Sloan Foundation grant for NISO initiative - creation of community-based standards: definitions, calculations, data classifications and data sharing practices
    32. Our Ongoing Study: “Research Impact Metrics: Use and Understanding by Post-Tenure Faculty”
    33. What do you think of when you hear the term impact metrics? “… I don't think that impact factors at the journal level necessarily capture the importance of the paper... I jokingly say that out of all the things I have published, the paper that I expect will have the longest life is the one that's got the lowest impact factor.” (Biology)
    34. What do you think of when you hear the term impact metrics? “Nothing. I don't understand what you're talking about. No clue. I understand ‘impact’, I understand those words separately. But not together.” (Mathematical and Computational Sciences) “Means nothing to me – zero.” (Political Science)
    35. What do you think of when you hear the term impact metrics? “I have a picture of that book that you find at the Robarts library on the 4th floor that is a collection of binders in colourful colours, and when you open it, it looks like the phone book with the list of all the articles and researchers, so that is my idea of impact metrics.” (Language Studies)
    36. Have you considered/do you use any alternative metrics? “No. No trust in them.” (Mathematical and Computational Sciences)
    37. “… I give a lot of relevance to those metrics and I will tell you why… There was an event in the year 1900. It was some popular festivity in England and they were running a contest. There was a cow or a bull and people had to guess its weight, and the person who got closest to the weight won the cow, or something like that. And someone kept a record of the answers and it is very interesting. It was very wide-ranging, so let's say the weight of the cow was 500 kg. So the range was from 300-1000. The most accurate answer was the average of all the answers that people gave… So what is that telling you? It is telling you many things… It is telling you that in this phase in which we are moving, in which content is user generated, there is a lot of proof there. You have to remember, I am a mathematician, I don't care what people think, we build truth from the bottom up…” (Mathematical and Computational Sciences)
    39. Bringing it back to libraries • Traditional metrics: libraries provide the support, libraries pay for the metrics, libraries purchase the research output • Altmetrics change the rules: now what happens? • It all depends on what libraries want to measure… tweets and blogs about our buildings, spaces and services; bookmarks of websites and LibGuides; likes on Facebook; downloads of librarian publications
    40. Thoughts?