Bibliometrics: Oldies, Goodies, Freebies SLA 2014


Presentation for Special Libraries Association 2014 Annual Conference

  • Origins and applications of bibliometrics
    Scopus, Google Scholar, and Web of Science
    Game-changers in bibliometrics
    New twists on an old theme – new metrics
    Free resources for bibliometric indicators/rankings etc.

  • What is bibliometrics? We all know that we need to cite our sources (plagiarism, etc.).
    Bibliometrics is the measurement of who cited whom.
    Scholarly applications include:
    Research patterns
    Evolution of knowledge
    Impact, productivity, prestige

  • Eugene Garfield, the "father of citation analysis," developed the first bibliometric index tools,
    the "ISI Indexes": Science Citation Index, Social Sciences Citation Index, and Arts & Humanities Citation Index
  • Usually for authors or groups of authors, articles, or sometimes journals
  • Measures “impact” of a journal (not an article) within a given subject
    Basically “how fast are ideas spreading from this journal to other publications?”

  • Cannot be used for cross-disciplinary comparisons (per Garfield himself) due to different rates of publication and citation
    Five-year impact factor added
    Two-year time frame not adequate for non-scientific disciplines
    Coverage of some disciplines not sufficient in the ISI databases
    Is a measure of "impact" a measure of "quality"?
  • The San Francisco Declaration on Research Assessment (DORA), initiated by the American Society for Cell Biology (ASCB) together with a group of editors and publishers of scholarly journals, recognizes the need to improve the ways in which the outputs of scientific research are evaluated. The group met in December 2012 during the ASCB Annual Meeting in San Francisco and subsequently circulated a draft declaration among various stakeholders. DORA as it now stands has benefited from input by many of the original signers. It is a worldwide initiative covering all scholarly disciplines. We encourage individuals and organizations who are concerned about the appropriate assessment of scientific research to sign DORA.
  • Journals employ several strategies to artificially raise the impact factor, which creates a positive feedback loop by incentivizing more scientists to submit to them. Some editors have been caught trying to induce authors to add citations to the journal's own articles to further raise the impact factor. One investigation found that a group of Brazilian journals had agreed to cite each other's articles heavily in order to fraudulently raise each journal's impact factor.
  • Traditional: Journal Impact Factor, citation count, half-life
    Citation count does not reveal patterns in citing references over time or their distribution across publications
    Journal Impact Factor and half-life measure only very recent impact
  • One-hit wonder is the green line, sustained career is the red line; the blue line is x = y
  • The yellow page is ranked as most relevant because many pages link to it. Many pages also link to the blue page, which makes the blue page's link more influential in the yellow page's ranking.
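The link diagram described above can be sketched as a toy PageRank power iteration. The graph, page names, and damping factor below are made up for illustration, not taken from the slide:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank: links maps each page to the pages it links to.
    A page earns a higher score when many (or influential) pages link to it."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform score
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:                         # dangling page: spread evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:                                # share rank across out-links
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical web: "yellow" is linked to by every other page,
# so it ends up with the highest score.
graph = {"yellow": ["blue"], "blue": ["yellow"], "a": ["yellow"], "b": ["yellow"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # yellow
```

Eigenfactor and SJR (later slides) apply the same eigenvector idea to journal-to-journal citations instead of hyperlinks.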
  • Web of Science: 1970s, Eugene Garfield, panel of expert reviewers decides journal inclusion
    Scopus: 1996, panel of reviewers; beware that for non-Elsevier content, all cited references may not be linked yet
    Google Scholar: 2004, no panel of reviewers; whatever the world puts up on the web
  • The "influence" of more heavily cited journals citing journal X lends a higher Eigenfactor score to journal X
    Now also available through WoS
  • From Thomson Reuters
    "Quick and dirty" articles on hot researchers, trending research topics, institutions, and journals
    Promotes the analytical products Thomson Reuters sells; no longer free
    Hit-or-miss information, not searchable
  • SJR measures a journal's "average prestige per paper," computed with a PageRank-style (eigenvector) algorithm
    Uses Scopus data
    Citation time window is 3 years (vs. 2 for JIF and 5 for Eigenfactor)
    Corrects for self-citations
    Because it is per paper, it accounts for differences in journal size
  • Does the citation pattern matter or just the count?
    Does the database being used cover my subject as thoroughly as possible?
    To what degree does my subject area rely on non-journal scholarly publications?

    1. Bibliometrics Resources: Oldies, (new) Goodies, and Freebies. Special Libraries Association Annual Conference, Sunday, June 8, 2014, Vancouver, BC, Canada. Elaine M. Lasda Bergman, Senior Assistant Librarian, University at Albany
    2. Bibliometrics
    3. Origins of Bibliometrics: citation count, Impact Factor, Immediacy Index, citation half-life
    4. Citation count: number of times cited. Does not take into account materials not included in the citation database or (possibly) self-citations
    5. Journal Impact Factor (Journal Citation Reports): the number of citations in a given year to articles the journal published in the past 2 years, divided by the number of scholarly articles the journal published in those 2 years
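The two-year JIF definition on slide 5 reduces to a single ratio; a minimal sketch, with hypothetical numbers:

```python
def journal_impact_factor(citations_to_prior_two_years, items_published_prior_two_years):
    """2-year JIF: citations in year Y to items the journal published in
    years Y-1 and Y-2, divided by the citable items it published in Y-1 and Y-2."""
    return citations_to_prior_two_years / items_published_prior_two_years

# Hypothetical journal: 300 citations in 2014 to its 2012-2013 articles,
# which numbered 150 citable items.
jif = journal_impact_factor(300, 150)
print(jif)  # 2.0
```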
    6. Concerns: domain/discipline specific; 2-year time frame; adequacy of journal coverage; impact = quality???
    7. Newer, Bigger Concerns with JIF
    8. New Twists on an Old Theme
    9. What is that H-index, anyway?
    10. H-index example (chart: number of citations vs. article number): Scholar A's papers have 10, 10, 9, 8, 7, 6, 6 citations (56 citations, h-index 6); Scholar B's have 27, 12, 5, 4, 4, 2, 2 (56 citations, h-index 4)
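The h-index on slide 10 (the largest h such that h of the author's papers each have at least h citations) can be computed directly; the two scholars' citation counts are the slide's own example:

```python
def h_index(citations):
    """Largest h such that the author has h papers with >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:       # paper at this rank still clears the bar
            h = rank
        else:
            break
    return h

scholar_a = [10, 10, 9, 8, 7, 6, 6]   # 56 citations total
scholar_b = [27, 12, 5, 4, 4, 2, 2]   # 56 citations total
print(h_index(scholar_a))  # 6
print(h_index(scholar_b))  # 4
```

Note that both scholars have 56 citations, yet their h-indexes differ: the h-index rewards sustained output over a single heavily cited paper.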
    11. Influence of Google PageRank (source: created by Felipe Micaroni Lalli)
    12. Sources of Citation Data
    13. Free Web Sources Using WoS Data
    14. Eigenfactor
    15. Science Watch
    16. Free Web Sources Using SCOPUS Data
    17. SJR: SCImago Journal Rank
    18. Journal M3trics
    19. Free Web Sources Using Google Scholar
    20. Publish or Perish
    21. Clean PoP
    22. PoP metrics: papers; citations; cites/paper; cites/author; papers/author; authors/paper; h-index; g-index; hc-index; hI-index; hI,norm; hm-index; e-index; AWCR; per-author AWCR
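Several of the PoP metrics on slide 22 are simple functions of an author's citation list. As one example, the g-index (the largest g such that the author's top g papers together have at least g² citations) can be sketched as follows; the citation counts below are hypothetical:

```python
def g_index(citations):
    """Largest g such that the top g papers together have >= g*g citations."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(counts, start=1):
        total += c                  # cumulative citations of the top `rank` papers
        if total >= rank * rank:
            g = rank
    return g

papers = [10, 8, 5, 4, 3]  # hypothetical citation counts
print(g_index(papers))  # 5
```

Unlike the h-index, the g-index lets one very highly cited paper lift the score, so the two metrics together give a fuller picture of a citation record.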
    23. Google Scholar Citations
    24. Other Assorted Tools
    25. Scholarometer
    26. Microsoft Academic
    27. Keep in Mind: journal quality ≠ article quality; citing a work ≠ agreement; self-citations; pattern variance from domain to domain
    28. Also: trend vs. citation count; database coverage; non-scholarly publications
    29. Thank you for coming. Elaine M. Lasda Bergman, University at Albany