Bibliometrics: Now There Are Options



Presentation about new bibliometric tools using citation data from ISI, Scopus, and Google Scholar for University at Albany Librarians, 4/28/11

  • Speaker note: g-index; contemporary h-index (factors in the age of articles); individual h-index (per author); hm index (corrects for multiple authors by reducing paper counts)

    1. Bibliometrics: Now There Are Options
       Elaine M. Lasda Bergman
       Bibliographer for Social Welfare and Dewey Reference
       Dewey Graduate Library
       April 28, 2011
    2. What is bibliometrics?
       Scholarly communication
       The influence of articles, journals, and scholars
    3. The birth of citation analysis
       Eugene Garfield
       Citation indexes and the JCR
       Better coverage of the hard sciences than the social sciences, and weaker still in the humanities
    4. Garfield’s metrics
       Citation count
       Impact Factor
       Immediacy Index
       Citation Half-Life
    5. Citation count
       Number of times cited within a given time period, for an author or a journal
       Does not take into account materials not included in the citation database, or self-citations
    6. Impact Factor
       Measures the “impact” of a journal (not an article) within a given subject
       The formula is a ratio: the number of citations in a given year to articles the journal published in the previous two years, divided by the number of scholarly articles the journal published in those two years
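The ratio on slide 6 can be sketched directly. A minimal Python sketch, assuming made-up citation and article counts rather than real JCR data:

```python
def impact_factor(citations_to_prev_two_years, articles_in_prev_two_years):
    """Two-year Impact Factor for year Y: citations received in Y to
    items the journal published in Y-1 and Y-2, divided by the number
    of scholarly articles it published in Y-1 and Y-2."""
    return citations_to_prev_two_years / articles_in_prev_two_years

# Hypothetical journal: 450 citations in 2010 to its 2008-2009 articles,
# of which it published 150 in those two years.
print(impact_factor(450, 150))  # 3.0
```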
    7. Concerns with the Impact Factor
       Cannot be used for cross-disciplinary comparisons (per Garfield himself), due to different rates of publication and citation across fields
       The two-year time frame is not adequate for non-scientific disciplines
       Coverage of some disciplines is not sufficient in the ISI databases
       Is a measure of “impact” a measure of “quality”?
    8. Immediacy Index
       What it’s supposed to measure: how quickly articles in a given journal have an impact on the discipline
       Formula: the average number of times an article published in a journal in a given year was cited in that same year
    9. Citation Half-Life
       What it’s supposed to measure: the duration of relevance of articles in a given journal
       Formula: the median age of the articles cited for a particular journal in a given year
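The two formulas above (Immediacy Index and Citation Half-Life) reduce to one-line computations. A sketch with invented numbers, using only the standard library:

```python
import statistics

def immediacy_index(same_year_citations, articles_published):
    """Citations in year Y to articles a journal published in Y,
    divided by the number of those articles."""
    return same_year_citations / articles_published

def citation_half_life(cited_article_ages):
    """Median age (in years) of the cited articles counted for a
    journal in a given year."""
    return statistics.median(cited_article_ages)

# Hypothetical journal: 200 articles drew 60 same-year citations,
# and the cited articles have these ages in years:
print(immediacy_index(60, 200))                     # 0.3
print(citation_half_life([1, 2, 3, 5, 8, 13, 21]))  # 5
```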
    10. Twenty-first-century tools
    11. The influence of Google PageRank
        Eigenvector analysis:
        “The probability that a researcher, in documenting his or her research, goes from a journal to another selecting a random reference in a research article of the first journal. Values obtained after the whole process represent a ‘random research walk’ that starts from a random journal to end in another after following an infinite process of selecting random references in research articles. A random jump factor is added to represent the probability that the researcher chooses a journal by means other than following the references of research articles.” (González-Pereira et al., 2010)
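The “random research walk” quoted above is a PageRank-style power iteration over a journal-to-journal citation matrix. A minimal sketch, assuming a made-up three-journal matrix and the conventional 0.15 jump factor:

```python
def random_research_walk(cites, jump=0.15, iters=200):
    """cites[i][j] = number of references in journal j that point to
    journal i.  Returns the stationary visit probabilities of a walk
    that follows a random reference with probability (1 - jump) and
    hops to a random journal otherwise."""
    n = len(cites)
    # Total outgoing references per journal (column sums).
    col_sums = [sum(cites[i][j] for i in range(n)) for j in range(n)]
    p = [1.0 / n] * n  # start from a uniformly random journal
    for _ in range(iters):
        p = [
            jump / n + (1 - jump) * sum(
                cites[i][j] * p[j] / col_sums[j]
                for j in range(n) if col_sums[j]
            )
            for i in range(n)
        ]
    return p

# Three toy journals; journal B (index 1) is cited most heavily
# and ends up with the largest visit probability.
cites = [[0, 1, 1],
         [5, 0, 3],
         [1, 1, 0]]
scores = random_research_walk(cites)
```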
    12. Sources Using ISI Data
    13. Uses ISI data and an eigenvector (PageRank) algorithm; lets you create your own journal categories
        Can assign different weights to citations from the same journal, from the same category, from other categories, or only within a specific list
        Not updated since 2005
    14. Uses ISI data
        Similar to PageRank
        Listed in the JCR as of 2009
        Eigenfactor Score: the influence of the citing journal divided by the total number of citations appearing in that journal
        Example: Neurology (2006): a score of .204 = an estimated 0.2% of all citation traffic among journals in the JCR (Bergstrom & West, 2008)
        Larger journals will have more citations and therefore larger Eigenfactor scores
    15. Article Influence Score
        From Eigenfactor: a measure of the prestige of a journal
        The average influence, per article, of the papers in a journal
        Comparable to the Impact Factor
        Corrects for journal size, an issue with the raw Eigenfactor score
        Neurology’s 2006 Article Influence score was 2.01, meaning an average article in Neurology is twice as influential as an average article in all of the JCR
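The size correction described above can be sketched as follows, assuming the Eigenfactor score is expressed as a percentage of total citation traffic (as in the Neurology example); all numbers are illustrative, not real JCR values:

```python
def article_influence(eigenfactor_pct, journal_articles, total_articles):
    """Normalizes a journal's Eigenfactor score (as a percentage of
    citation traffic) by its share of all articles in the dataset;
    1.0 means an average article has average influence."""
    article_share = journal_articles / total_articles
    return 0.01 * eigenfactor_pct / article_share

# A journal drawing 2% of citation traffic while publishing only
# 1% of the articles scores 2.0: twice the average influence.
print(article_influence(2.0, 100, 10_000))  # 2.0
```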
    16. ScienceWatch
        Provides “quick and dirty” articles on hot researchers, trending research topics, institutions, and journals
        Much of this site (in-cites, etc.) is now part of analytical products sold by Thomson; no longer free
        There are still some good articles, but the site is not searchable, so finding information is hit or miss
    17. New sources of citation information
        Google Scholar
        Scopus
    18. Scopus: an alternate database of citation data
        Review panel, i.e., quality control
        Bigger field than ISI: covers all the journals in WoS and more
        Strongest in the “hard” sciences; ostensibly improved social science coverage; arts and humanities are “getting there”
        Algorithmically determined, with human editing
    19. Google Scholar: an alternate database of citation data
        No rhyme or reason to what is included
        Biggest source of citation data
        Foreign-language sources
        Sources other than scholarly journals
        Entirely algorithmically determined, with no human editing
    20. Scopus analytics
        SNIP
        SJR/SCImago
        Author Evaluator
    21. SNIP (Source Normalized Impact per Paper)
        Journal ranking based on citation analysis, with adjustments for the frequency of citations among the other journals in the field (the field being all journals that cite this particular journal)
        SNIP is defined as the ratio of the journal’s citation count per paper to the citation potential in its subject field (Moed, 2009)
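Moed’s ratio is straightforward to compute; a sketch with invented numbers, where citation potential reflects how densely the citing field cites:

```python
def snip(citations_per_paper, field_citation_potential):
    """Source Normalized Impact per Paper: raw citations per paper
    divided by the citation potential of the journal's subject field,
    so journals in sparsely-citing fields are not penalized."""
    return citations_per_paper / field_citation_potential

# The same raw impact (4 citations/paper) ranks higher in a field
# that cites sparsely (potential 2) than in one that cites densely (8).
print(snip(4, 2))  # 2.0
print(snip(4, 8))  # 0.5
```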
    22. SJR: SCImago Journal Rank
        What it’s supposed to measure: the current “average prestige per paper”
        The SCImago website uses journal/citation data from Scopus; SJR is also available in the Scopus database
        Formula: the citation time window is three years, instead of two as in the JIF
        Corrects for self-citations
        Correlates strongly with the JIF
    23. SCImago Journal Rank
        Prestige factors include the number of journals in the database, the number of the journal’s papers in the database, and the citation counts and “importance” received from other journals; size-dependent: larger journals have greater prestige values
        Normalized by the number of significant works published by the journal, which helps correct for size variations
        Corrections are made for journal self-citations
    24. Scopus Author Evaluator
        Breakdown of documents by source
        h-index
        Citations per year (graph)
    25. Google Scholar
        Publish or Perish
        CIDS
    26. Publish or Perish
        Provides a variety of metrics for measuring scholarly impact and output
        More useful for metrics on authors than on journals or institutions
        Uses Google Scholar citation information
        Useful for interdisciplinary topics, fields relying heavily on conference papers or reports, non-English-language sources, new journals, etc.
        Continuously updated since 2006
    27. Publish or Perish metrics
        Basic metrics: number of papers, number of citations, active years, years since first publication, average citations per paper, average citations per year, average citations per author, etc.
        Complex metrics: the h-index and its many variations, including the m-quotient; the g-index (which corrects the h-index for variations in citation patterns); the AR index; and the AW index
        Does not correct for SELF-CITATIONS
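The two “complex metrics” named above reduce to short computations over a sorted citation list. A sketch with an invented publication record:

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
    return h

def g_index(citations):
    """Largest g such that the top g papers together have at least
    g*g citations; rewards a few very highly cited papers more than
    the h-index does."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(counts, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical author with seven papers:
cites = [10, 8, 5, 4, 3, 2, 1]
print(h_index(cites))  # 4
print(g_index(cites))  # 5
```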
    28. CIDS
        Measures authors’ output for prestige and influence
        Similar to PoP
        Corrects for self-citations
        Uses Google Scholar data
    29. CIDS metrics
        Citations per year, h-index, g-index, total citations, average cites per paper, self-citations included and excluded, etc.
    30. MESUR
        A metric based on usage, citation, and bibliographic data
        Uses its own databases of documents/metadata/references, users and authors, “usage events,” and citations
        The project seems to be dead?
    31. Considerations
        Don’t measure an individual article’s impact by the metrics for the entire journal
        Clustering of citations across years
        Negative citations
        A few high-impact citations vs. a lot of low-impact citations
        Source of the citing documents: foreign, conference proceedings, traditional
    32. Questions???