Transcript

  • 1. Application of bibliometric analysis. Thed van Leeuwen, CWTS, Leiden University. DAIR-Workshop, 5 November 2008
  • 2. Contents
    • Introduction of CWTS and bibliometrics
    • Part I Various databases
      • Differences between Web of Science and Scopus
      • Adequacy of citations indexes
    • Part II Various indicators one can encounter
      • ISI Impact Factors: calculation and validity
      • International rankings
      • The H-index and its limitations
    • Part III Various methodological issues
      • Journal & field normalization
      • Citation Windows & Impact Measurement
      • Typology of the studies
      • Methodology of the studies
  • 3. Introduction of CWTS and bibliometrics
  • 4. Introduction of CWTS
    • CWTS is a research institute at Leiden University Faculty of Social Sciences.
    • CWTS has been based mainly on contract research, but still produces roughly 10 papers in scientific journals per year.
    • It is currently in a transformation process, based on a mix of public funding and contract research, focused on both fundamental research in the field and service contracts.
  • 5. The work of CWTS
    • Since the early 1990s, CWTS has conducted large-scale bibliometric research performance analyses that accompany research assessments by review committees.
    • Many of these studies had a disciplinary character, initiated by the VSNU.
    • A revision of the evaluation protocol (SEP) in 2003 caused a shift of the analyses to university- or institute-initiated studies.
  • 6. Introduction of bibliometrics
    • Bibliometrics can be defined as the quantitative analysis of science and technology performance and the cognitive and organizational structure of science and technology.
    • The basis for these analyses is the scientific communication between scientists through (mainly) journal publications.
    • Key concepts in bibliometrics are output and impact, as measured through publications and citations.
    • An important starting point in bibliometrics: scientists express, through the citations in their scientific publications, a certain degree of influence of others on their own work.
    • Through large-scale quantification, citations indicate the influence or (inter)national visibility of scientific activity.
  • 7. CWTS data system
    • CWTS has a full bibliometric license from Thomson Reuters Scientific to conduct evaluation studies using the Web of Science
    • Our database covers the period 1981-2007.
    • Some characteristics:
      • 27,000,000 publications;
      • 500,000,000 citation relations between source papers;
      • 48,000,000 authors (incl. variations);
      • 28,000,000 addresses, some 90% cleaned up over the last 10 years;
      • Contains reference sets for journal and field citation data.
  • 8. Part I: Various databases
  • 9. Differences between Citation Indexes: implications for bibliometric studies (Martijn S. Visser and Henk F. Moed)
  • 10. Multidisciplinary Citation Indexes
    • Web of Science
      • since 1963, formerly produced by ISI
      • ca. 9,000 Journals are indexed
    • Scopus
      • launched by Elsevier in 2004
      • ca. 15,000 journals, conference papers and other sources
    • Google Scholar
      • launched in 2004
      • coverage unclear
  • 11. Disciplinary distribution of journals in WoS
    • Roughly 5,000 journals from the natural, life, medical and technical sciences.
    • Roughly 2,500 journals from the social and behavioral sciences.
    • Roughly 1,500 journals from the humanities.
  • 12. Contents
    • Comparing Web of Science and Scopus on a paper by paper basis
    • RAE 2001: Coverage differences in a practical situation
    • Implications for bibliometric studies: case study in the field of oncology (SCImago)
  • 13. Comparing WoS & Scopus
    • Time period: 1996 – 2006
    • Snapshot in time
    • Only citeable documents
    • Matching algorithm (a minimal matching sketch follows below)
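The slides do not spell out the matching algorithm itself. A minimal sketch of one plausible approach, matching records on DOI where available and otherwise on a normalized title, publication year and first-author key, could look as follows; the field names and the fallback key are assumptions, not the CWTS procedure.

```python
import re

def match_key(record: dict) -> str:
    """Build a hypothetical matching key for a bibliographic record.

    Uses the DOI when present; otherwise falls back to a normalized
    title + publication year + first-author surname. The field names
    ('doi', 'title', 'year', 'first_author') are illustrative only.
    """
    if record.get("doi"):
        return record["doi"].lower().strip()
    title = re.sub(r"[^a-z0-9]+", " ", record["title"].lower()).strip()
    return f"{title}|{record['year']}|{record['first_author'].lower()}"

def overlap(wos: list[dict], scopus: list[dict]) -> set[str]:
    """Return the keys of records present in both databases."""
    return {match_key(r) for r in wos} & {match_key(r) for r in scopus}
```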
  • 14. Matching WoS with Scopus on a paper-by-paper basis
    • Papers 1996-2006: 14.7 M in Scopus, 10.5 M in WoS, intersection 9.4 M
    • Overall overlap: 59%
    • WoS overlap with Scopus: 90%
    • Scopus overlap with WoS: 64%
  • 15. I: From a WoS perspective
    • To what extent are citeable documents (articles, letters, notes and reviews) in journals processed for the WoS covered by Scopus?
  • 16. Scopus coverage of WoS papers increases over time
  • 17. Incomplete coverage of WoS journals decreases over time
    • In 1996, 55% of WoS journals were completely covered in Scopus; in 2005, 82% were.
    • A share of WoS journals is not covered by Scopus at all.
  • 18. II: From a Scopus perspective
    • To what extent are citeable documents (articles, notes, letters, reviews, conference papers) in sources processed for Scopus not covered by the Web of Science? (= the 'Scopus surplus')
  • 19. NATURAL AND LIFE SCIENCES
  • 20. ENGINEERING SCIENCES, SOCIAL SCIENCES & HUMANITIES
  • 21. III: From an external perspective
    • To what extent are documents in an external file processed for the WoS and/or Scopus?
  • 22. Coverage differences in a practical case: 2001 RAE papers
    • Up to 4 publications of every active faculty staff member
    • Time Period 1994/1996 – 2000
    • Papers assigned to units of assessment
  • 23. Only small overall coverage differences
    • Science: 95,056 papers; 84.1% in WoS, 84.4% in Scopus (+0.3)
    • Mathematics: 6,634 papers; 81.8% in WoS, 80.1% in Scopus (-1.7)
    • Social Science and Humanities: 91,324 papers; 24.9% in WoS, 25.9% in Scopus (+1.0)
  • 24. Coverage differences for Units of Assessments (Sciences)
    • Scopus –WoS > 3%
      • Nursing
      • Clinical Dentistry
      • Civil Engineering
      • Other Studies and professions allied to Medicine
      • Computer Science
      • Mineral and Mining Engineering
    • WoS – Scopus > 3%
      • Pharmacology
      • Food Science and Technology
      • Physics
      • Anatomy
  • 25. Why are the differences relatively small?
    • 'Best' papers are more likely to be published in high-impact journals, which tend to be processed by both indexes
    • British academics may not often use sources solely processed by Scopus
  • 26. Need to characterize the surplus coverage of Scopus
    • Geographical location
    • Language
    • References
    • Citedness
  • 27. Case study for journals in Oncology
    • Cooperation between SCImago and CWTS (Carmen Lopez Illescas )
    • Comparison between Scopus and Web of Science
    • Comparison on a journal-by-journal basis
  • 28. Overlap from a WoS perspective: WoS cancer categories
    • WoS category Oncology: 126 journals in total
      • In Scopus, in a cancer category: 112 (89%)
      • In Scopus, in another category: 14 (11%)
      • Not in Scopus: 0 (0%)
  • 29. Overlap from a Scopus perspective: Scopus cancer categories
    • Scopus category Oncology: 167 journals; Cancer Research: 139 journals; 75 journals in both categories (231 journals in total)
      • In WoS, in a cancer category: 112 (48%)
      • In WoS, in another category: 13 (6%)
      • Not in WoS: 106 (46%)
  • 30. Characterization of Scopus surplus: language and publisher country (data source: Ulrich)
    • Journals in both WoS and Scopus (n=112): 94% English language, 4 different languages, 78% published in USA/UK/Netherlands
    • Journals in Scopus but not in WoS (n=106): 72% English language, 15 different languages, 51% published in USA/UK/Netherlands
  • 31. Characterization of Scopus surplus: other journal properties (data source: Ulrich; WoS & Scopus n=112 vs Scopus-only n=106)
    • Started between 1996 and 2006: 17 (15.2%) vs 57 (53.8%)
    • Free online media: 1 (0.9%) vs 10 (9.4%)
    • Refereed: 108 (96.4%) vs 72 (67.9%)
    • Frequency of 12 or more issues a year: 69 (61.6%) vs 31 (29.2%)
  • 32. Main characteristics Scopus surplus
    • More journals in non-English languages
    • More recently established
    • Less often refereed
    • Relatively low impact factors
  • 33. Implications for bibliometric studies
    • Web of Science tends to select only journals with sufficiently high impact (research front)
    • Scopus tends to be more representative of the total of scientific literature
    • The position of the Western World in Scopus is probably less dominant
  • 34. Strong relationship between nr WoS and Scopus papers for the 50 most productive countries
  • 35. Average Citation rate in WoS and Scopus for the 50 most productive countries
  • 36. Negative relationship between number of papers in Scopus surplus and average citation rate
  • 37. Effect of Scopus surplus:
    • Countries that profit most in terms of percentage of published documents tend to show a decline in their average citation rate
    • Keeping in mind the RAE outcomes, the effect of the extension is also that countries publishing relatively often in non-English journals show a decline in their average citation rate
    • Conclusion: 'More is not necessarily better!'
  • 38. Discussion
    • Citation indexes (will) adopt a more inclusive coverage policy in which citation impact is less important as a criterion for selection.
    • This will have implications
      • for the way bibliometric assessments of research performance have to be carried out
      • for the interpretation of bibliometric indicators and rankings derived from these databases
  • 39. Discussion
    • Should bibliometric studies aim for the widest possible coverage? It depends on what you want to measure.
    • How should one deal with possible bias as a result of changing coverage policies? Define sub-universes (journal sets) of publications and citations (e.g. national / international).
  • 40. Adequacy of citation indexes : implications for bibliometric studies
  • 41. How to tackle this issue ?
    • We conduct analyses on the adequacy of the citation indexes across disciplines based on reference behavior of researchers themselves.
    • The degree of referring towards other indexed literature indicates the importance of journal literature in the scientific communication process.
  • 42. The medical & Life sciences
  • 43. The natural sciences
  • 44. The technical sciences
  • 45. The social– and behavioral sciences
  • 46. The humanities
  • 47. Overall WoS coverage by main field
    • EXCELLENT (> 80%): Biochem & Mol Biol, Biol Sci – Humans, Chemistry, Clin Medicine, Phys & Astron
    • VERY GOOD (60-80%): Appl Phys & Chem, Biol Sci – Anim & Plants, Psychol & Psychiat, Geosciences, Soc Sci ~ Medicine
    • GOOD (40-60%): Mathematics, Economics, Engineering
    • MODERATE (< 40%): Other Soc Sci, Humanities & Arts
  • 48. Conclusions on adequacy issue
    • We can clearly conclude that the application of bibliometric techniques, solely based on WoS (but very likely also Scopus) will not be valid for some of the ‘soft’ fields in the social sciences and the humanities.
    • That is why the tool box has to be extended !
  • 49. Part II: Bibliometric indicators one encounters in the field
  • 50. Some basic indicators are …
    • P: number of publications in journals processed for the Web of Science.
    • C: number of received citations, excl. self-citations.
    • CPP: mean number of citations per publication, excl. self-citations.
    • Pnc: percentage of the publications not cited (within a certain time-frame!).
    • %SC: percentage of self-citations related to an output set.
    • (A minimal sketch of how these indicators could be computed follows below.)
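A minimal sketch of how these basic indicators could be computed, assuming each publication is represented by its total citation count and its self-citation count; the data structure is an assumption for illustration, not the CWTS implementation.

```python
def basic_indicators(pubs: list[dict]) -> dict:
    """Compute P, C, CPP, Pnc and %SC for a publication set.

    Each publication is assumed to be a dict with 'citations' (total
    received citations) and 'self_citations'; these field names are
    illustrative, not the CWTS data model.
    """
    P = len(pubs)
    total_cites = sum(p["citations"] for p in pubs)
    self_cites = sum(p["self_citations"] for p in pubs)
    C = total_cites - self_cites                 # citations excl. self-citations
    CPP = C / P if P else 0.0                    # mean citations per publication
    uncited = sum(1 for p in pubs if p["citations"] == p["self_citations"])
    Pnc = 100.0 * uncited / P if P else 0.0      # % publications not cited
    pct_SC = 100.0 * self_cites / total_cites if total_cites else 0.0
    return {"P": P, "C": C, "CPP": CPP, "Pnc": Pnc, "%SC": pct_SC}
```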
  • 51. ISI Impact Factors: calculation and validity
  • 52. Methodology: ISI's classical IF
    • The ISI Impact Factor (IF) is defined as the number of citations received by a journal in year t, divided by the number of citeable documents published in that same journal in the years t-1 and t-2.
    • Or, as a formula:
    IF(t) = citations received in year t / number of 'citeable documents' in t-1 and t-2
    • (A minimal computational sketch follows below.)
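A minimal computational sketch of this definition; the function and argument names are illustrative.

```python
def impact_factor(citations_in_t: int, citeable_t_minus_1: int, citeable_t_minus_2: int) -> float:
    """Classical ISI Impact Factor for year t.

    citations_in_t: citations received by the journal in year t
    citeable_t_minus_1 / citeable_t_minus_2: 'citeable documents'
    published in the journal in years t-1 and t-2.
    """
    return citations_in_t / (citeable_t_minus_1 + citeable_t_minus_2)
```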
  • 53. Practical exercise on IF
    • Here we get to the first practical exercise of the afternoon
    • Read the Science commentary on Impact Factors.
    • Calculate the Impact Factor for ‘the Lancet’
    • Discuss the results.
  • 54. Share of 'citations for free' for The Lancet
    • Publications ('90+'91) and citations (1992), by document type:
      • Article: 784 publications, 2,986 citations
      • Note: 144 publications, 593 citations
      • Review: 29 publications, 232 citations
      • Sub-total: 957 publications (a), 7,959 citations (b)
      • Letter: 4,181 publications, 4,264 citations
      • Editorial: 1,313 publications, 905 citations
      • Other: 1,421 publications, 909 citations
      • Total: 7,872 publications, 14,037 citations (c)
    • ISI method: citations in 2000 / citeable documents in '98 and '99 = 14,037 (c) / 957 (a), giving IF = 14.7
    • CWTS method: citations to articles, notes and reviews in 2000 / articles, notes and reviews in '98 and '99 = 7,959 (b) / 957 (a), giving IF = 8.3
    • CWTS method: citations to articles, letters, notes and reviews in 2000 / articles, letters, notes and reviews in '98 and '99 = (7,959 + 4,264) / (957 + 4,181), giving IF = 2.4
    • (A short sketch reproducing these IF values follows below.)
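A short sketch reproducing the three values above; this is pure arithmetic on the numbers given in the slide, nothing else is assumed.

```python
# Document counts and citations for The Lancet, as given in the slide
articles_notes_reviews = 957        # (a)
citations_to_anr = 7959             # (b)
all_citations = 14037               # (c)
letters = 4181
citations_to_letters = 4264

isi_if = all_citations / articles_notes_reviews                  # 14.7
cwts_if_anr = citations_to_anr / articles_notes_reviews          # 8.3
cwts_if_anlr = (citations_to_anr + citations_to_letters) / (
    articles_notes_reviews + letters)                            # 2.4

print(round(isi_if, 1), round(cwts_if_anr, 1), round(cwts_if_anlr, 1))
```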
  • 55. ISI Impact Factors
    • From 1995 onwards, CWTS has analyzed the uses and validity of the ISI Journal Impact Factor (IF).
    • The most important points of criticism were:
      • Calculated erroneously.
      • Not sensitive to the composition of the journal in terms of document types.
      • Not sensitive to the science fields a journal is attached to.
      • Based on too short 'citation windows'.
  • 56. Distribution of citations used for the calculation of the IF value of The Lancet
    • The IF score of The Lancet is seriously 'overrated' by the scientific 'audience' of the journal.
    • The red area indicates citations 'for free', while the blue area indicates 'correct citations'.
  • 57. Impact Factors for Br. J. Clin. Pharm. and Clin. Pharm. & Ther.
    • The graph shows the correct and erroneous impact factors of BJCP and CPT.
    • In the case of CPT, citations to published meeting abstracts are included, while BJCP has stopped publishing meeting abstracts!
  • 58. Document types and fields: the ten highest-ranking (on IF values) review journals. The IF is for 2002; the JFIS covers 1998-2002. (Field: journal, IF (rank), JFIS (rank))
    • IMMUNOLOGY: ANN REV IMMUNOL, IF 50.49 (1), JFIS 5.18 (1)
    • BIOCHEM & MOLECULAR BIOL: ANN REV BIOCHEM, IF 34.61 (1), JFIS 4.10 (3)
    • PHARMACOL & PHARMACY: PHARMACOLOGICAL REV, IF 27.74 (1), JFIS 4.75 (1)
    • CELL BIOL: ANN REV CELL & DEVELOPM BIOL, IF 27.53 (1), JFIS 1.72 (13)
    • DEVELOPMENTAL BIOL: ANN REV CELL & DEVELOPM BIOL, IF 27.53 (1), JFIS 1.72 (3)
    • PHYSIOLOGY: PHYSIOLOGICAL REV, IF 24.82 (1), JFIS 3.18 (1)
    • CELL BIOLOGY: NATURE REV MOL CELL BIOL, IF 22.21 (4), JFIS 2.76 (8)
    • ENDOCRINOL & METABOLISM: ENDOCRINE REV, IF 21.98 (1), JFIS 2.87 (1)
    • NEUROSCIENCES: ANN REV NEUROSCIENCE, IF 21.89 (1), JFIS 3.12 (4)
    • PHYSICS: REV MODERN PHYSICS, IF 20.14 (1), JFIS 5.02 (1)
    • CHEMISTRY: CHEMICAL REV, IF 19.67 (1), JFIS 2.89 (2)
  • 59. Fields and Citation windows
  • 60. Citation measurement of IF
    • Diagram: the IF counts, for each citing year t, the citations in year t to publications from t-1 and t-2; over the period 1999-2006 this gives a moving two-year publication window (1999-2000 cited in 2001, 2000-2001 cited in 2002, and so on).
  • 61. CWTS answer to the problems of the IF
    • This indicator is the JFIS, the Journal-to-Field Impact Score.
    • The JFIS solves the problems of the Impact Factor, as
      • the calculation of JFIS is based on equally large entities,
      • document types are taken into account ,
      • JFIS is field-normalized, and finally,
      • based on longer citation windows (1-4 years)
  • 62. Citation measurement of JFIS
    • Diagram: the JFIS uses citation windows of one to four years over the period 1999-2006; for a given measurement year, publications from four years back get a four-year window, those from three years back a three-year window, and so on down to a one-year window for the most recent publication year, with the whole scheme shifting forward year by year.
  • 63. Practical exercise on journal impact measures
    • Here we get to the second practical exercise of the afternoon
    • Consider the application of journal impact measures in evaluation procedures, and try to think of possible advantages/disadvantages.
    • Discuss the results.
  • 64.
    • Topic: within the Netherlands, research performance evaluation studies are performed on a regular basis.
    • The Dutch Pediatrics Society (NVKG) evaluated the Dutch pediatric centers on the basis of mean IF values related to their output.
    • CWTS was asked to benchmark this method, in order to indicate possible flaws of the method used.
  • 65. Results Case II: Different measures, different conclusions (values are publications / score, listed per centre in the order AMC, WKZ, ERAS, AZL; not all centres have output in every field)
    • Output vs. IF:
      • ENDOCRINOLOGY: 28 / 0.71; 46 / 1.25; 64 / 1.25; 23 / 0.59
      • GASTROENTEROL: 35 / 1.35
      • IMMUNOLOGY: 31 / 1.78; 82 / 1.10; 46 / 1.10
      • ONCOLOGY: 61 / 1.23; 42 / 1.06
      • METABOLISM: 178 / 0.80; 72 / 0.87; 23 / 0.53
    • Output vs. JFIS:
      • ENDOCRINOLOGY: 28 / 0.38; 46 / 1.22; 64 / 1.25; 23 / 0.73
      • GASTROENTEROL: 35 / 0.98
      • IMMUNOLOGY: 31 / 1.22; 82 / 1.17; 46 / 0.97
      • ONCOLOGY: 61 / 1.40; 42 / 1.09
      • METABOLISM: 178 / 0.88; 72 / 0.88; 23 / 0.27
    • Output vs. actual field-normalized impact:
      • ENDOCRINOLOGY: 28 / 1.86; 46 / 2.23; 64 / 2.21; 23 / 1.22
      • GASTROENTEROL: 35 / 2.64
      • IMMUNOLOGY: 31 / 4.38; 82 / 3.24; 46 / 2.94
      • ONCOLOGY: 61 / 2.39; 42 / 2.11
      • METABOLISM: 178 / 2.36; 72 / 2.06; 23 / 1.45
  • 66. International rankings of university performance
  • 67. Various academic rankings
    • ARWU or Shanghai Ranking
    • THES Ranking
    • CHE Excellence Ranking
    • CWTS University Ranking
  • 68. Composition of the Shanghai ranking
    • Nobel Prizes and Fields Medals won by alumni: 10%
    • Nobel Prizes and Fields Medals won by staff: 20%
    • Highly cited staff in 21 disciplines: 20%
    • Articles published in Nature & Science: 20%
    • Articles published in the citation indexes: 20%
    • Per capita performance on the above indicators: 10%
    • (A minimal weighting sketch follows below.)
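A minimal sketch of how such a weighted composite could be combined, assuming each indicator has already been normalized to a 0-100 scale; the normalization itself is not described in the slide, so it is left out here.

```python
# Weights of the Shanghai (ARWU) indicators as listed above
WEIGHTS = {
    "alumni_prizes": 0.10,       # Nobel Prizes / Fields Medals won by alumni
    "staff_prizes": 0.20,        # Nobel Prizes / Fields Medals won by staff
    "highly_cited_staff": 0.20,  # highly cited researchers in 21 disciplines
    "nature_science": 0.20,      # articles in Nature & Science
    "indexed_articles": 0.20,    # articles in the citation indexes
    "per_capita": 0.10,          # per-capita performance on the above
}

def composite_score(scores: dict[str, float]) -> float:
    """Weighted sum of indicator scores, each assumed to be on a 0-100 scale."""
    return sum(WEIGHTS[k] * scores.get(k, 0.0) for k in WEIGHTS)
```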
  • 69. Composition of THES ranking
    • Academic peer review 40% .
    • Employer review 10%
    • Faculty – Student ratio 20%
    • Citation per faculty 20%
    • International faculty 5%
    • International students 5%
  • 70. Composition of CHE ranking
    • Size indicator : output volume in citation indexes.
    • Perception indicator : citations (in relation to an international standard)
    • Beacon indicator: Number of often-cited staff & Nobel prize winners at the university
    • Europe indicator : number of projects in the Marie Curie research promotion programme of the EU
    • No weights are indicated, rankings are applied on four fields (biology, chemistry, mathematics, & physics), data delivered by CWTS.
  • 71. Important CWTS standard indicators
    • CPP/JCSm: ratio between the real, actual impact and the mean journal impact.
    • CPP/FCSm: ratio between the real, actual impact and the mean field impact.
    • JCSm/FCSm: ratio between the journal impact and the field impact, indicative of the 'quality' of the journal package in the field.
  • 72. Composition of CWTS ranking
    • Yellow: ranking by size, the number of publications.
    • Green: ranking by the size-independent, field-normalized citation impact (the 'crown indicator').
    • Orange: ranking by the size-dependent 'brute force' impact indicator, the multiplication of P by the university's field-normalized average impact.
    • Blue: ranking by the 'simple' citations-per-publication indicator (CPP).
    • Pink: ranking by the 'simple' citations-per-publication indicator (CPP) for the top-50 institutes ranked by size.
  • 73. Practical exercise on rankings
    • Here we get to the third practical exercise of the afternoon
    • Which disadvantages can you think of, when considering the indicators used in the various rankings ?
    • Discuss the results.
  • 74. The H-Index and its limitations
  • 75. The H-Index, defined as …
    • The H-index of a publication set is the largest rank h such that, when the publications are sorted by the number of citations received, the publication at rank h has received at least h citations.
    • The idea of the American physicist J. Hirsch, who published this index in the Proceedings of the National Academy of Sciences (PNAS).
    • (A minimal sketch follows below.)
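A minimal sketch of this definition; this is the standard h-index computation, not CWTS-specific code.

```python
def h_index(citation_counts: list[int]) -> int:
    """Largest h such that at least h publications have >= h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# e.g. h_index([10, 8, 5, 4, 3]) == 4
```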
  • 76. Examples of Hirsch-index values
    • Environmental biologist, output of 188 papers, cited 4,788 times in the period 1980-2004.
    • Hirsch-index value of 31.
    • Clinical psychologist, output of 72 papers, cited 760 times in the period 1980-2004.
    • Hirsch-index value of 14.
  • 77. Problems with the H-Index
    • For serious evaluation of scientific performance, the H-index is not suitable as an indicator, as the index:
      • Is insensitive to field-specific characteristics (e.g., differences in citation cultures between medicine and other disciplines).
      • Does not take into account the age and career length of scientists; a small oeuvre necessarily leads to a low H-index value.
  • 78.
    • Actual versus field normalized impact (CPP/FCSm) displayed against the output.
    • Large output can be combined with a relatively low impact
  • 79.
    • H-Index displayed against the output.
    • Larger output is strongly correlated with a high H-Index value.
  • 80. Part III Methodological issues
  • 81. Journal & Field Normalization
  • 82. Network of publications (nodes) linked by citations (edges)
    • Diagram: citation density differs across the network, lower in, e.g., applied research and the social sciences, higher in, e.g., basic natural and medical research; the values CPP, JCSm and FCSm are used for normalization.
  • 83. Calculating the JCSm & FCSm
    • Four example publications (citations counted until 1999):
      • I: review, 1996, CANCER RES (Oncology), 17 citations
      • II: note, 1997, J CLIN END (Endocrinology), 4 citations
      • III: article, 1999, J CLIN END (Endocrinology), 6 citations
      • IV: article, 1999, J CLIN END (Endocrinology), 8 citations
  • 84. Calculating the JCSm & FCSm (2)
    • Per publication: CPP / JCS / FCS:
      • I: 17 / 16.9 / 23.7
      • II: 4 / 3.1 / 3.0
      • III: 6 / 4.8 / 4.1
      • IV: 8 / 4.8 / 4.1
  • 85. Practical exercise on calculating CWTS normalized indicators
    • Here we get to the fourth practical exercise of the afternoon
    • How would one calculate the CWTS normalized indicators CPP/JCSm and CPP/FCSm ?
    • Show the outcomes and discuss the results.
  • 86. Calculating the JCSm & FCSm (3)
    • The mean citation score is determined as: CPP = (17 + 4 + 6 + 8) / (1 + 1 + 1 + 1) = 8.8
    • The mean journal citation score as: JCSm = ((1 x 16.9) + (1 x 3.1) + (2 x 4.8)) / (1 + 1 + 2) = 7.4
    • The mean field citation score as: FCSm = ((1 x 23.7) + (1 x 3.0) + (2 x 4.1)) / (1 + 1 + 2) = 8.7
    • CPP / JCSm = 8.8 / 7.4 = 1.19
    • CPP / FCSm = 8.8 / 8.7 = 1.01
    • (A short sketch reproducing this calculation follows below.)
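A short sketch reproducing the worked example above; only the numbers from slides 83-84 are used, and per-publication averaging is equivalent to the weighted form shown on the slide.

```python
# (citations received, journal citation score JCS, field citation score FCS)
pubs = [
    (17, 16.9, 23.7),   # I   review,  CANCER RES, 1996
    (4,   3.1,  3.0),   # II  note,    J CLIN END, 1997
    (6,   4.8,  4.1),   # III article, J CLIN END, 1999
    (8,   4.8,  4.1),   # IV  article, J CLIN END, 1999
]

CPP  = sum(c for c, _, _ in pubs) / len(pubs)   # 8.75, slide rounds to 8.8
JCSm = sum(j for _, j, _ in pubs) / len(pubs)   # 7.40
FCSm = sum(f for _, _, f in pubs) / len(pubs)   # 8.73, slide rounds to 8.7

print(CPP / JCSm)   # ~1.18 (the slide, using rounded values, reports 1.19)
print(CPP / FCSm)   # ~1.00 (the slide, using rounded values, reports 1.01)
```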
  • 87. Citation Windows & Impact Measurement
  • 88. Citation measurement and 'windows'
    • Publication years with a fixed citation 'window':
      • publications of 1994, with three citation years (1994, 1995 and 1996), followed by publications of 1995, with three years, etc.
    • Blocks of publication years with a window decreasing in length:
      • publications of 1994-1997, with citation windows of 4 years (1994-1997), 3 years (1995-1997), 2 years (1996-1997), and 1 year (1997).
    • (A minimal sketch generating both window schemes follows below.)
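A minimal sketch generating both window schemes; the function names are illustrative, not CWTS terminology.

```python
def fixed_windows(first_year: int, last_year: int, length: int = 3) -> dict[int, list[int]]:
    """Fixed citation window: each publication year gets `length` citation years."""
    return {y: list(range(y, y + length)) for y in range(first_year, last_year + 1)}

def block_windows(block_start: int, block_end: int) -> dict[int, list[int]]:
    """Decreasing windows within a block: every window closes at the block's last year."""
    return {y: list(range(y, block_end + 1)) for y in range(block_start, block_end + 1)}

# fixed_windows(1994, 1996) -> {1994: [1994, 1995, 1996], 1995: [1995, 1996, 1997], ...}
# block_windows(1994, 1997) -> 1994 gets a 4-year window, ..., 1997 a 1-year window
```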
  • 89. Citation measurement with a 'fixed window'
    • Diagram: each publication year from 1994 onwards gets a fixed three-year citation window (1994-1996 for publications of 1994, 1995-1997 for 1995, and so on, up to the citation years available through 2001).
  • 90. Citation measurement with 'year blocks'
    • Diagram: for each block of publication years (1994-1997, 1995-1998, and so on through 1998-2001), the citation window closes at the last year of the block, so the oldest publication year gets a four-year window and the most recent a one-year window.
  • 91. Typology of studies
  • 92. Typology of studies
    • We distinguish various levels of analysis:
      • Macro-level , e.g. country comparison for the EU, Dutch Observatory of S&T;
      • Meso-level , e.g. disciplinary evaluation of physics research in the Netherlands;
      • Micro-level , e.g. analysis of research institutes, programmes, or groups;
      • Nano-level , e.g. analysis of individual researchers.
  • 93. Types of data collection
    • We distinguish various types of data collection:
      • Address-based, e.g. country or institute comparisons. Here we select publications from the database starting from country or institute names;
      • Author-name-based, e.g. various evaluation procedures. Here we select publications from the database; the result is verified by the researchers themselves;
      • Publication-list-based, e.g. various evaluation procedures. For example, matching a list retrieved from Metis with the WoS.
  • 94. Methodology of studies
  • 95. Functions of applying bibliometrics
    • Using bibliometrics as a diagnostic tool, we can distinguish two main functions:
      • Mainly evaluative (e.g., studies for the VSNU, the Dutch Association of Universities).
      • Mainly descriptive (e.g., on the German medical sciences, but also CWTS benchmark studies).
  • 96. Goals of applying bibliometrics
    • Using bibliometrics to measure output and impact, we can distinguish two main goals:
      • Gaining insight into the research potential of entities or a complete organization.
      • Gaining insight into the past performance of entities or a complete organization.
  • 97. Choices and consequences
    • The final choice for a certain approach or type of study is mainly ‘customer driven’.
    • Depending on the goals one wants to achieve (evaluation of research (groups), description of the scientific profile of an institute or university, etc.), specific approaches fit the raised questions.
  • 98. Models of bibliometric analysis
    • In our analyses, we can roughly distinguish between two different types of models :
      • Top-Down approach
      • Bottom-Up approach
    In the next section, we will further explain these two different approaches.
  • 99. The 'building blocks' of an organization
      • University
      • Laboratories / research groups
      • Researchers (in these laboratories)
      • Scientific publications
  • 100. Top Down Approach CWTS compilation of address-based publication set
  • 101. Bottom Up Approach Author verification of scientific publications
  • 102. Top Down Approach ( modified ) CWTS compilation of address-based publication set, verification by client
  • 103. Bottom Up Approach (modified) Client compilation of verified set of scientific publications
  • 104. Combining approach with goal
    • Past performance: Top-Down yes, Bottom-Up yes
    • Research potential: Top-Down yes/no, Bottom-Up yes
  • 105. Combining approach with function
    • Descriptive: Top-Down yes, Bottom-Up yes
    • Evaluative: Top-Down no, Bottom-Up yes
  • 106. Possible results
    • Often, bibliometric results combine the scores of an organization with the distribution over fields of scientific activity.
    • A common misunderstanding is to link, mentally, the scores of a unit (that fits the name of the field) to that field.
  • 107. Research profile
    • Profile of an academic medical center
    • Impact at average level in three fields
  • 108. Differences between organizational units and fields
    • Diagram: research groups of the Dept. of Physics (P-I to P-IV) and the Dept. of Chemistry (C-I to C-IV) mapped onto the fields Physics, Chemistry and Chemical Engineering.
  • 109. Choices and consequences
    • The crucial role of the verification of publication output is shown.
    • Depending on the goals and functions related to the application of bibliometric analysis, a specific approach fits the raised questions best.
  • 110. Conclusions
    • Within a Top-Down approach, evaluation is excluded as an option.
    • Within a Top-Down approach, insight into research potential is only partial.
    • The Bottom-Up approach provides the most insight for evaluative purposes, at the level of both past performance and research potential.
  • 111. Practical exercise on methodology
    • Here we get to the final practical exercise of the afternoon
    • Which advantages/disadvantages can you think of, when considering the two different bibliometric approaches, Top-Down versus Bottom-Up?
    • Discuss the results.
  • 112.
    • End of the workshop
    • For questions regarding the contents of the workshop, mail to: [email_address]