Bibliometrics (1) JIFs and JCRs

Mini slideshow for use in web pages at http://www.dur.ac.uk/library/research/bibliometrics/


Speaker notes:
  • Expanding your search is about two things: thinking about all the possible terms, spellings and concepts which might be applicable to what you are looking at, to ensure you don’t accidentally rule out any useful result simply due to semantics; and, if you have focussed a search too much and aren’t finding the results you expect, widening your net to see what else you can find. You’ll have some keywords in mind but, as few of you will be experts in this area, you will need to think about how others have framed research in this field. The keywords you use will change once you start looking for information and finding relevant resources: you can borrow authors’ search terms and add them to your list. Synonyms: e.g. survey or questionnaire – make sure you don’t… British and American spellings: use wildcards, e.g. colo?r finds colour and color. Truncation: e.g. educati* finds education, educating, educationalist.
  • Eugene Garfield founded the Institute for Scientific Information (ISI). In 1955 he proposed a methodology for the creation of a citation index for science.
  • A citation index was not a completely new idea in other contexts: in the legal field, case citation indexes tracking the subsequent treatment of judicial decisions had been in place in the US since 1873, and in the UK from 1947. From this came Garfield’s first mention of using citation data to measure an “impact factor” – in this initial paper at the article level, but later at the journal level.
  • One key tool in this area is WoS (stress Web of Science, not Web of Knowledge). It indexes approx. 12,000 journals and aims to rank journals within categories. It was not designed to make judgements about performance, but there has been a shift in that direction and tools have been developed to help in this respect. Note, though, that WoS covers subject areas to varying degrees. Mention Scopus / Google Scholar.
  • Benefits: the small time frame stops any bias towards older journals. Biases: towards journals which publish a lot of review articles, as these are more likely to be widely cited; and towards European and North American journals – remember, only journals in this database count, and there are few in languages other than English (LOTE).
  • DEMO 1 – JIFs. JCR – Science or Social Science edition; 2011 data is the latest issue. Cover subject categories and journal rankings, and individual titles.
  • JCR 2012 should be available c. June/July 2013. Explain that the numerator and denominator are not based on the same criteria: citations in 2011 to all articles published by Journal X in 2009 & 2010 (numerator) counts everything cited in the journal, whereas the number of articles published in Journal X in 2009 & 2010 (denominator) counts only “research” articles (excluding letters, opinion papers, etc.).
  • This graph shows how the JIF can vary quite a bit even within a single subject area – therefore no generalisations can be made, nor comparisons drawn between subject areas. An impact factor of 1.5 could be excellent in one discipline but sub-standard in another.
  • Possibly, in terms of scholarly citations, but it is not as simple as that. There is far more to consider: is the journal accessible to a wide audience (e.g. is the research hidden behind a paywall which only some researchers can access)? Is the readership of the journal proactive in reporting or citing the research via other means (e.g. Twitter, blogs)?
  • Eigenfactor: “The Eigenfactor Score calculation is based on the number of times articles from the journal published in the past five years have been cited in the JCR year, but it also considers which journals have contributed these citations so that highly cited journals will influence the network more than lesser cited journals. References from one article in a journal to another article from the same journal are removed, so that Eigenfactor Scores are not influenced by journal self-citation.” Article Influence Score: the Article Influence determines the average influence of a journal’s articles over the first five years after publication. It is calculated by dividing a journal’s Eigenfactor Score by the number of articles in the journal, normalised as a fraction of all articles in all publications. This measure is roughly analogous to the 5-Year Journal Impact Factor in that it is a ratio of a journal’s citation influence to the size of the journal’s article contribution over a period of five years. The mean Article Influence Score is 1.00; a score greater than 1.00 indicates that each article in the journal has above-average influence, and a score less than 1.00 indicates below-average influence.
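The two-year JIF calculation described in the notes above can be sketched in a few lines of Python. All the numbers here are invented for illustration; real JIFs come from the Journal Citation Reports, using Web of Science data.

```python
def journal_impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year JIF for year Y: citations in Y to items the journal published
    in Y-1 and Y-2, divided by the number of 'citable' items it published in
    Y-1 and Y-2.

    Note the mismatch the notes warn about: the numerator counts citations to
    *anything* the journal published (including letters and opinion pieces),
    while the denominator counts only 'research' articles and reviews.
    """
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical Journal X in 2012:
citations_2012_to_2010_2011 = 450   # citations to everything the journal published
citable_items_2010_2011 = 300       # research articles and reviews only
print(journal_impact_factor(citations_2012_to_2010_2011, citable_items_2010_2011))  # 1.5
```

Because the denominator excludes some item types that can still attract citations, a journal can raise its JIF by publishing more “non-citable” front matter – one of the criticisms the notes allude to.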

    1. Part 1 Journal metrics
    2. Journal Ranking 1955 Eugene Garfield - the idea of creating a citation index for science to…
    3. Journal Ranking “eliminate the uncritical citation of fraudulent, incomplete or obsolete data by making it possible for the conscientious scholar to be aware of criticisms of earlier papers.” Garfield, E. (1955) ‘Citation Indexes for Science’, Science, New Series, Vol. 122, No. 3159, pp. 108-111
    4. Journal Ranking 1955 Eugene Garfield - the idea of measuring the “impact” of journal articles using citations
    5. Journal Ranking 1955 Eugene Garfield - the idea of measuring the “impact” of journal articles using citations 1960s Science Citation Index developed to highlight “formal, explicit linkages between papers that have particular points in common”
    6. Journal Ranking 1955 Eugene Garfield - the idea of measuring the “impact” of journal articles using citations 1960s Science Citation Index developed to highlight “formal, explicit linkages between papers that have particular points in common” 1975 Journal Citation Reports – uses WoS data to rank journals within disciplines
    7. Journal Citation Reports • JCRs – annual publication of journals, their impact factors and other metrics. • 10,675 titles had JIFs in the 2011 Science and Social Science editions • A journal that is cited once, on average, for each article published has a JIF of 1.
    8. Journal Citation Reports
    9. Journal Impact Factor Journal X’s 2012 impact factor = Citations in 2012 to all articles published by Journal X in 2011 & 2010 ÷ Number of articles that were published in Journal X in 2011 & 2010
    10. Journal Impact Factor Journal X’s 2012 impact factor = Citations in 2012 (in journals indexed in Web of Science) to all articles published by Journal X in 2011 & 2010 ÷ Number of articles (deemed to be citable by Web of Science) that were published in Journal X in 2011 & 2010
    11. Journal Ranking (2008) Taylor and Francis LibSite Newsletter, issue 9, p. 2
    12. Journal Ranking (2008) Taylor and Francis LibSite Newsletter, issue 9, p. 2 The average JIF in one discipline will vary considerably from that in another.
    13. Journal Ranking (2008) Taylor and Francis LibSite Newsletter, issue 9, p. 2 You cannot compare a journal in one field of study to one in another field of study based upon their respective JIFs.
    14. Journal Ranking (2008) Taylor and Francis LibSite Newsletter, issue 9, p. 3
    15. Journal Ranking (2008) Taylor and Francis LibSite Newsletter, issue 9, p. 3 Even within a single discipline, it is difficult to make a generalisation or comparison across different subjects.
    16. What you can ask… Am I more likely to attract more citations (and presumably reach a larger audience) in a journal with a higher JIF than in a journal with a lower JIF?
    17. What you can ask… Has this article attracted more, or fewer, citations than the average number of citations an article in the same journal might attract?
    18. Other journal impact metrics • Eigenfactor - http://www.eigenfactor.org/ – Web of Science data • SCImagoJR - http://www.scimagojr.com/ – Scopus data – Includes country ranking
    19. Image Credits [Slide 1] Via Flickr Creative Commons, by emdot. Original available at http://www.flickr.com/photos/35237093637@N01/56156364
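The Article Influence Score described in the speaker notes – a journal’s Eigenfactor Score divided by its normalised share of all articles – can be sketched as below. The 0.01 scaling factor reflects the convention (as stated on eigenfactor.org) that Eigenfactor Scores are expressed as percentages; all input numbers here are hypothetical.

```python
def article_influence(eigenfactor_score, journal_articles, all_articles):
    """Hedged sketch of the Article Influence Score.

    eigenfactor_score: the journal's Eigenfactor Score (a percentage-style value).
    journal_articles:  number of articles the journal published in the window.
    all_articles:      number of articles in all indexed publications in the window.
    """
    # Normalise the journal's article count as a fraction of all articles,
    # then divide the (rescaled) Eigenfactor Score by that fraction.
    article_fraction = journal_articles / all_articles
    return 0.01 * eigenfactor_score / article_fraction

# Hypothetical journal: Eigenfactor Score 2.0, publishing 1,000 of 100,000
# indexed articles. Its articles carry 2x the average influence (score 2.0,
# where the mean score is 1.00).
print(article_influence(2.0, 1000, 100000))
```

The intuition matches the notes: a journal can have a large Eigenfactor simply by being big, so dividing by its share of articles recovers a per-article measure, with 1.00 as the average.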
