The use of bibliometric indicators in research assessment: A critical overview

1. The use of bibliometric indicators in research assessment: A critical overview. Henk F. Moed, Senior Scientific Advisor, Elsevier, Amsterdam, Netherlands.
2. Contents
   1 Beyond journal impact factor and H-index
   2 University rankings have a limited value
   3 Combine indicators and peer review
   4 Social sciences deserve special attention
   5 Indicators can be manipulated
   6 Explore usage-based indicators
3. Section 1: Beyond journal impact factor and H-index
4. Journal impact measures are poor predictors of an individual paper's actual citation impact. Partly based on the International Mathematical Union's report 'Citation Statistics' (2008).
5. Normal vs. skewed distributions. (Charts: the body-length distribution of boys, mean length 95 cm, vs. adult players, mean length 185 cm, is roughly normal; the citation distribution of papers in PAMS, JIF = 0.43, vs. TAMS, JIF = 0.85, is highly skewed.)
6. What is the probability that a randomly selected boy is at least as tall as a randomly selected adult? (Mean length: boys 95 cm; adults 185 cm.) Almost zero. And what is the probability that a randomly selected PAMS paper is cited at least as often as a randomly selected TAMS paper? (JIF: PAMS 0.43; TAMS 0.85.) 62%.
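Why can the answer be as high as 62% when TAMS's impact factor is twice that of PAMS? Because citation distributions are heavily skewed: most papers in both journals receive zero or one citation, so ties and reversals between randomly drawn papers are common. The sketch below is a hedged illustration using invented negative-binomial distributions whose means match the two JIFs (these are not the actual PAMS/TAMS data); it yields a probability in the same ballpark as the slide's 62%.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented skewed citation distributions; the means match the JIFs
# (0.43 and 0.85) but these are NOT the real PAMS/TAMS data.
pams = rng.negative_binomial(n=0.3, p=0.41, size=100_000)  # mean ~0.43
tams = rng.negative_binomial(n=0.3, p=0.26, size=100_000)  # mean ~0.85

# Probability that a random PAMS paper is cited at least as often
# as a random TAMS paper (independent draws, ties included).
print(f"P(PAMS >= TAMS) ~ {np.mean(pams >= tams):.2f}")
```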
7. Journal metrics should account for 'free' citations (and usage).
8. Base journal metric: impact factor = (citations to all documents) / (number of citable documents).
9. Citable vs. non-citable documents. Citable: articles, letters, reviews. 'Non-citable': editorials, discussion papers.
10. The problem of 'free' citations, part 1: the denominator counts only the citable documents, while the numerator counts citations to documents of all types.
11. The problem of 'free' citations, part 2: citations to 'non-citable' documents are 'free': they add to the numerator without adding anything to the denominator, inflating the metric.
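A minimal numerical sketch of the asymmetry described on slides 8 to 11; all document and citation counts below are invented for illustration.

```python
# Invented yearly counts for a hypothetical journal.
documents = {"article": 180, "letter": 40, "review": 15, "editorial": 25}
citations = {"article": 400, "letter": 30, "review": 150, "editorial": 50}

# Articles, letters and reviews count as 'citable'; editorials do not.
citable = {"article", "letter", "review"}
citable_docs = sum(documents[t] for t in citable)

# Base metric: citations to ALL document types over citable docs only.
base = sum(citations.values()) / citable_docs
# Corrected metric: drop the 'free' citations to non-citable items.
corrected = sum(citations[t] for t in citable) / citable_docs

print(f"base metric:              {base:.2f}")  # inflated by free citations
print(f"excluding free citations: {corrected:.2f}")
```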
12. SNIP corrects for disparities in citation potential among fields.
13. SNIP, base concept: SNIP = (a journal's 'raw' citation impact) / (the 'topicality' of its subject field).
14. How is a field's 'topicality' measured? Topicality is operationalized as citation potential, which is in turn measured by the length of cited reference lists.
15. Differences in citation potential between fields. (Chart: % of papers by number of received citations in Molecular Biology vs. Mathematics, with sample reference lists shown alongside.)
16. SNIP = (a journal's raw impact per paper) / (the citation potential in its subject field). The numerator reflects the journal's scope and focus and the database coverage (peer-reviewed papers only); the denominator reflects a field's frequency and immediacy of citation, measured relative to the database median.
17. How is a journal's subject field delimited? A journal's subject field = the set of papers citing the target journal.
18. Example 1: Molecular Biology vs. Mathematics

    Journal        JIF   Cit Pot   SNIP (= JIF / Cit Pot)
    INVENT MATH    1.5       0.4   3.8
    MOLEC CELL    13.0       3.2   4.0
19. Example 2: within Mathematics

    Journal                          JIF   Cit Pot   SNIP (= JIF / Cit Pot)
    Int J Nonlinear Sci & Num Sim    4.2       2.0   2.1
    Commun Partial Different Equat   1.1       0.5   2.1
20. Example 3: Social Sciences vs. Biological & Medical Sciences

    Journal                            JIF   Cit Pot   SNIP
    J GERONTOL - A (Biol & Med Sci)    3.7       2.0   1.8
    J GERONTOL - B (Psych & Soc Sci)   2.7       1.2   2.3
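To make the arithmetic of the three examples explicit, the sketch below recomputes SNIP as JIF divided by citation potential for the six journals above. Note that the tabulated inputs are themselves rounded, so the recomputed ratios can differ from the slides' SNIP values in the last decimal.

```python
# SNIP = raw impact per paper (approximated here by the JIF) divided
# by the citation potential of the journal's subject field.
journals = [
    ("INVENT MATH",                     1.5, 0.4),
    ("MOLEC CELL",                     13.0, 3.2),
    ("Int J Nonlinear Sci & Num Sim",   4.2, 2.0),
    ("Commun Partial Different Equat",  1.1, 0.5),
    ("J GERONTOL - A",                  3.7, 2.0),
    ("J GERONTOL - B",                  2.7, 1.2),
]
for name, jif, cit_pot in journals:
    print(f"{name:32} SNIP = {jif / cit_pot:.1f}")
```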
21. Strong points of SNIP:
   • Takes into account a journal's scope
   • Allows cross-subject comparisons
   • Is independent of an a priori subject categorization
   • Can be calculated for general journals
   • Less potential for gaming
   • Accounts for differences between and within journal subject 'categories'
22. Institutional research assessment should apply indicators of actual citation impact and adequate benchmarking.
23. Be careful when using the H-index: different citation distributions may have the same value.
24. All three publication lists have a Hirsch index of 5:
   • Author 1: 30, 10, 8, 6, 5, 1, 0 citations for papers P1-P7 (h = 5)
   • Author 2: 30, 10, 8, 6, 5, 4, 4, 4, 4 citations for papers P1-P9 (h = 5)
   • Author 3: 100, 70, 8, 6, 5, 1, 0 citations for papers P1-P7 (h = 5)
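A short sketch of the h-index computation, run on the three lists above, makes the slide's point concrete: the index ignores both Author 2's extra mid-range papers and Author 3's two highly cited papers, even though the authors' total citation counts (60, 75 and 190) differ widely.

```python
def h_index(citations):
    # Largest h such that at least h papers have at least h citations each.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

authors = {
    "Author 1": [30, 10, 8, 6, 5, 1, 0],
    "Author 2": [30, 10, 8, 6, 5, 4, 4, 4, 4],
    "Author 3": [100, 70, 8, 6, 5, 1, 0],
}
for name, cites in authors.items():
    print(f"{name}: h = {h_index(cites)}, total citations = {sum(cites)}")
```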
25. Bibliometric indicators are becoming increasingly 'informative'.
26. Bibliometric indicators more and more...
   • Embody ways to put numbers in context. Example: field-normalized citation measures.
   • Take into account who is citing. Example: citations weighted with the impact of the citing source.
   • Take into account the relationship between citing and cited author. Example: impact outside one's own niche; multidisciplinarity; bridging paradigms.
   • Combine various types of indicators. Example: HR data on personnel (gender, age, funding, ...).
27. Section 2: University rankings have a limited value
28. University ranking positions are primarily marketing tools, not research management tools.
29. Research assessment methodologies must take into account... [EC AUBR Expert Group]
   1. Inclusive definition of research / output
   2. Different types of research and its impacts
   3. Differences among research fields
   4. Type and mission of institution
   5. Proper units of assessment
   6. Policy context, purpose and user needs
   7. The European dimension
   8. Need to be valid, fair and practically feasible
30. Types of outputs (SSH), by impact area:
   • Scientific-scholarly. Publication/text: journal paper; book chapter; monograph. Non-publication: research data file; video of experiment.
   • Educational. Publication/text: teaching course book; syllabus. Non-publication: skilled researchers.
   • Economic. Publication/text: patent. Non-publication: product; process; device; design; image.
   • Cultural. Publication/text: newspaper article. Non-publication: interviews; events; performances; exhibits.
31. In institutional research assessment, bottom-up approaches must include data verification by the evaluated authors.
32. Top-down institutional analysis:
   1. Select an institution's papers using author affiliations (incl. verification)
   2. Categorize articles into research fields
   3. Calculate indicators
   4. Compare with benchmarks
33. Bottom-up institutional analysis:
   1. Compile a list of researchers
   2. Compile a list of publications per researcher (incl. verification)
   3. Aggregate researchers into groups, departments, fields, etc.
   4. Calculate indicators; compare with benchmarks
34. Metrics provide insight into global or systemic patterns.
35. Gini index of disciplinary specialization. (Charts: publication distributions over disciplines with Gini = 0.0, 0.27, 0.52 and 0.70; data for a general, a polytechnical and a specialized university.)
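The Gini index here measures how unevenly an institution's publication output is spread over disciplines: 0 means a perfectly even spread (a general university), and values approaching 1 mean full specialization. A minimal sketch with invented publication counts over five disciplines:

```python
import numpy as np

def gini(counts):
    # Gini coefficient of a distribution, via the ordered-values formula.
    x = np.sort(np.asarray(counts, dtype=float))
    n = x.size
    return float((2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum()))

print(round(gini([20, 20, 20, 20, 20]), 2))  # 0.0: general university
print(round(gini([60, 20, 10, 5, 5]), 2))    # 0.5: strongly specialized
```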
36. Relative Citation Rate (RCR) = (the average citation rate of a unit's papers) / (the world citation average in the subfields in which the unit is active). The RCR corrects for differences in citation practices among fields, publication years and types of article.
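A sketch of the RCR computation, assuming (hypothetically) that every paper of the unit can be matched to a world average citation rate for its subfield, publication year and article type; all numbers below are invented.

```python
# Each paper: its citation count and the world average for papers of
# the same subfield, year and article type (hypothetical values).
papers = [
    {"cites": 12, "world_avg": 8.0},
    {"cites": 3,  "world_avg": 2.0},
    {"cites": 5,  "world_avg": 4.0},
]

unit_rate = sum(p["cites"] for p in papers) / len(papers)
world_rate = sum(p["world_avg"] for p in papers) / len(papers)

rcr = unit_rate / world_rate
print(f"RCR = {rcr:.2f}")  # > 1: above world average in the unit's own fields
```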
37. Specialized universities perform less well in their fields of specialization than general institutions do. (Chart: citation impact, high vs. low, for specialized vs. general institutions; data: Scopus / SCImago IR, n = 1,500.)
38. There is no linear correlation between a country's institutional concentration and its citation impact. (Data: Scopus / SCImago.)
39. Section 3: Combine indicators and peer review
40. Metrics can contribute to keeping the peer review process honest.
41. Case study: a national research council.
   • Proposals are evaluated by committees, each covering a discipline
   • Reports from external referees
   • Committee members are themselves among the applicants
42. Affinity between applicants and Committee:
   0 Applicants are/were not a member of any Committee
   1 A co-applicant is/was a member of a Committee, but not of the one evaluating
   2 The first applicant is/was a member of a Committee, but not of the one evaluating
   3 A co-applicant is a member of the Committee(s) evaluating the proposal
   4 The first applicant is a member of the Committee(s) evaluating the proposal
43. For 15% of applications, an applicant is a member of the evaluating Committee (affinity = 3 or 4). (Chart, % of applications by affinity level: 0: 63.2; 1: 10.2; 2: 11.5; 3: 5.9; 4: 9.1.)
44. The probability of being granted increases with increasing applicant-Committee affinity. (Chart, % of granted applications by affinity level: 0: 37.0; 1: 46.9; 2: 60.1; 3: 62.6; 4: 74.0.)
45. Logistic regression analysis: affinity between applicant and Committee has a significant effect upon the probability of being granted.

    MAXIMUM-LIKELIHOOD ANALYSIS-OF-VARIANCE TABLE (N = 2,499)
    Source                            DF   Chi-Square     Prob
    INTERCEPT                          1        18.47   0.0000
    Publ impact applicant              3        26.97   0.0000 **
    Rel transdisc impact applicant     1         0.29   0.5926
    Affinity applicant-committee       2       112.50   0.0000 **
    Sum requested                      1        45.47   0.0000 **
    Institution applicant              4        25.94   0.0000 **
    LIKELIHOOD RATIO                 199       230.23   0.0638
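For readers who want to see the shape of such an analysis, the sketch below fits a logistic regression of grant success on affinity and requested sum. The data is synthetic and the covariate coding is simplified; it reproduces only the kind of model behind the table above, not the study itself.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2_499  # same N as the study, but the records below are synthetic

affinity = rng.integers(0, 5, size=n).astype(float)  # affinity level 0-4
requested = rng.normal(0.0, 1.0, size=n)             # standardized sum requested

# Synthetic ground truth: higher affinity raises the grant probability.
true_logit = -0.5 + 0.4 * affinity - 0.2 * requested
granted = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(int)

X = sm.add_constant(np.column_stack([affinity, requested]))
fit = sm.Logit(granted, X).fit(disp=False)
print(fit.summary(xname=["const", "affinity", "requested"]))
```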
46. The future of research assessment exercises lies in the intelligent combination of metrics and peer review.
47. Intelligent combination of 'metrics' and peer review:
   • Policy makers let the type of peer review depend upon the outcomes of a bibliometric study
   • Peer committees use citation analysis for initial rankings and explicitly justify why their final judgments deviate
   • Metrics are used to assess peer review processes
48. Section 4: Social sciences deserve special attention
49. Coverage of a journal-based citation index (CI): what percentage of a field's cited/target documents, and of its citing/source documents, falls inside the index?
50. Science: roughly 80% of documents fall inside the CI and roughly 20% outside.
51. Humanities: roughly 20% of documents fall inside the CI and roughly 80% outside.
52. CI coverage by field (coverage declines as output shifts from journals toward books and proceedings):
   • EXCELLENT (>80%): Biochem & Mol Biol; Biol Sci – Humans; Chemistry; Clin Medicine; Phys & Astron
   • GOOD (60-80%): Appl Phys & Chem; Biol Sci – Anim & Plants; Psychol & Psychiat; Geosciences; Soc Sci ~ Medicine
   • FAIR (40-60%): Mathematics; Economics; Engineering
   • MODERATE (<40%): Other Soc Sci; Humanities & Arts
53. Options for creating a comprehensive database of research outputs in the social sciences & humanities.
54. Options and example cases:
   1. Combine existing SSH bibliographies (e.g. CSA-Illumina)
   2. Create new SSH databases (e.g. Iberian Citation Index)
   3. Expand existing citation indexes (WoS, Scopus)
   4. Explore Google Scholar and Google Book Search
   5. Combine output registration systems (e.g. METIS (NL))
   6. Build a citation index from repositories (e.g. the Book Citation Index project)
   7. Use electronic library catalogues (e.g. WorldCat)
55. Section 5: Indicators can be manipulated
56. Users and producers of metrics should be alert to 'manipulation'.
57. Effects of editorial self-citations upon journal impact factors [Reedijk & Moed, J. Doc., 2008]:
   • Editorial self-citation: a journal editor cites, in his editorials, papers published in his own journal
   • The focus is on 'consequences' rather than 'motives'
58. Case: the ISI/JCR impact factor of a gerontology journal, as published in the journal itself. (Chart: impact factor, i.e. cites per 'citable' document, by year, 2000-2004.)
59. Decomposition of the impact factor of the gerontology journal. (Chart: cites per 'citable' document by year, 2000-2004, split into editorial self-citations and free citations.)
60. One can identify, and correct for, the following types of strategic editorial behavior:
   • Publish 'non-citable' items
   • Publish more reviews
   • Publish 'top' papers in January
   • Publish 'topical' papers (with high short-term impact)
   • Cite your journal in your own editorials
   • Excessive journal self-citing
61. Section 6: Explore usage-based indicators
62. An analogy model mapping formal use onto informal use:
   • (Collections of) publishing authors ↔ (collections of) users
   • Citing a document ↔ retrieving the full text of a document
   • Article ↔ user session
   • Author's institutional affiliation ↔ user's account name
   • Number of times cited ↔ number of times retrieved as full text
63. Age distribution of downloads vs. citations [Tetrahedron Lett., ScienceDirect; Moed, JASIST, 2005]. (Chart: % of downloads and of citations by age in months.)
64. Ageing of downloads vs. citations: a two-factor vs. a single-factor model. (Chart, log scale: observed and computed downloads, observed and computed citations, and singular points, by age in months.)
65. Do more downloads lead to more citations, or do more citations lead to more downloads?
66. Citations lead to downloads [Moed, J. Am. Soc. Inf. Sci. Techn., 2005]. (Chart: downloads of paper A by its age in months.) Paper A is published; paper B is published and cites A; the downloads of A increase; paper C is published, citing A and B.
67. Downloads and citations relate to distinct phases in scientific information processing... but (many) more cases must be studied.
68. Thank you for your attention!
