Transcript of "LEIDEN UNIVERSITY 2010"
Slide 1: Ranking of Universities: Methods, Limitations, Challenges. Middle East Technical University, Ankara, December 3, 2010. Anthony F.J. van Raan, Center for Science and Technology Studies (CWTS), Leiden University.
Slide 2: Leiden University, the oldest in the Netherlands (1575), member of the League of European Research Universities (LERU). Leiden: historic city (2nd and 11th C.), with a strong cultural (arts, painting) and scientific tradition, and one of the largest science parks in the EU.
Slide 5: First and major challenge: establish, as well as possible, the research performance in the recent past in quantifiable terms (objectivity, reliability). If you succeed in quantifying performance, you can always make a ranking.
Slide 6: Two methodological approaches: (1) 'broader' peer review > expert survey > reputation assessment; (2) bibliometric indicators > performance and impact measurement. At the department or research-program level, (1) and (2) are strongly correlated.
Slide 7: Conceptual problems of peer review/surveys: * Slow: 'old' reputation vs. performance at the research front now. * A mix of opinions about research, training, teaching, and political/social status.
Slide 8: Conceptual problems of bibliometric analysis: 1. Evidence of research performance such as prizes and awards, earlier peer review, invitations as keynote speaker, etc. is not covered. However, in many cases these 'non-bibliometric' indicators correlate strongly with bibliometric indicators. 2. Focus on journal articles: differences across disciplines.
Slide 9: Citing publications and cited publications: citations may come from other disciplines, from emerging fields, from research devoted to societal, economic and technological problems, from industry, and from international top groups. All of these are functions of time! > Sleeping Beauties.
Slide 10: The WoS/Scopus sub-universe (journal articles only, >1,000,000 papers/year) within the total publication universe: the natural/medical sciences are covered far better than the applied sciences, social sciences, and humanities.
Slide 11: WoS/Scopus sub-universe (journal articles only, >1,000,000 papers/year); references > non-WoS. The total publication universe includes non-WoS publications: books, book chapters, conference proceedings, reports, and non-WoS journals.
Slide 12: Field-specific normalization. With C(A) the citations to the P(A) publications of the institute, and C(f), P(f) the same for the field: C(A)/P(A) = CPP and C(f)/P(f) = FCS, so

  CPP/FCSm = [C(A)/P(A)] / [C(f)/P(f)]

plus document-type normalization; no self-citations, also not in C(f)!
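To make the arithmetic concrete, here is a minimal Python sketch of this normalization (the function and variable names, and the numbers in the example, are invented for illustration; self-citations are assumed to be excluded from the counts beforehand):

```python
def cpp_fcsm(inst_cites, inst_pubs, field_cites, field_pubs):
    """Field-normalized impact: the institute's citations per publication
    (CPP) divided by the field's citations per publication (FCS)."""
    cpp = inst_cites / inst_pubs     # C(A)/P(A)
    fcs = field_cites / field_pubs   # C(f)/P(f)
    return cpp / fcs

# Hypothetical institute: 1,200 citations to 150 papers, in a field
# averaging 6.4 citations per paper (320,000 citations to 50,000 papers).
print(cpp_fcsm(1200, 150, 320_000, 50_000))  # -> 1.25, above world average
```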
Slide 13: [Diagram: journals Ja1-Ja4 and Jb1-Jb5 grouped by bibliographic coupling.]
Slide 14: Economics journals 2008: journal map, n=332, with JCS/FCS labels (http://www.neesjanvaneck.nl/journalmap).
Slide 15: Economics journals 2008: journal density map, n=332.
Slide 16: [Diagram: fields Fa1-Fa4 and Fb1-Fb5 grouped by bibliographic coupling.]
Slide 17: Enhance the visibility of engineering, the social sciences, and the humanities by proper publication-density normalization. Size: world average. Science as a structure of 200 related fields: a world map.
Slide 18: Urban Studies 2006-2008: concept map, n=700.
Slide 19: Urban Studies 2006-2008: concept density map, n=700. More stable, more comprehensible.
Slide 20: F(t1) > F(t2): can knowledge dynamics be described by a continuity equation, which expresses the change of a quantity inside any region in terms of density and flow?
Slide 21: University, departments, fields. 'Bottom-up' analysis: input data (assignment of researchers to departments) are necessary > detailed research performance analysis of a university by department. 'Top-down' analysis: the field structure is imposed on the university > broad overview analysis of a university by field.
Slide 24: Indicators divided into 5 categories, each with a fraction of the final ranking score: Teaching 0.30; Research 0.30; Citations 0.325; Industry income 0.025; International 0.05. The fractions are based on 'expert enthusiasm' and 'confidence in data'.
Slide 28: Indicators divided into 5 categories, each with a fraction of the final ranking score: Academic reputation 0.40; Employer reputation 0.10; Student/staff ratio 0.20; Citations 0.20; International 0.10.
Slide 31: Indicators divided into 5 categories, each with a fraction of the final ranking score: Nobel Prize alumni/awards 0.10 + 0.20; HiCi staff, N&S, PUB 0.20 + 0.20 + 0.20; PCP (size normalization) 0.10. Citations only via HiCi (though as a citations-per-staff measure)!
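All three weighting schemes boil down to a weighted sum over normalized category scores. A minimal sketch using the first scheme's fractions (the university profile below is invented, and real rankings additionally rescale raw indicators before weighting):

```python
# Category weights from the first scheme above.
WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.325,
    "industry_income": 0.025,
    "international": 0.05,
}

def composite_score(scores: dict) -> float:
    """Weighted sum of per-category scores, each assumed pre-scaled to 0-100."""
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Hypothetical university profile:
print(composite_score({
    "teaching": 70, "research": 80, "citations": 90,
    "industry_income": 50, "international": 60,
}))  # -> 78.5
```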
Slide 34: Leiden Ranking, Spring and Autumn 2010: the 500 largest universities worldwide and the 250 largest universities in Europe. Spring 2010: 2003-2008 (P) / 2009 (C); Autumn 2010: 2003-2009 (P and C).
Slide 35: Main features of the Leiden Ranking 2010: * Best possible definition of universities by applying the CWTS address unification algorithm. * New indicators next to the standard CWTS indicators, including university-industry collaboration (autumn). * Field-specific rankings and top indicators (15 major fields) > benchmarking.
Slide 36: P = 2004-2008, C = 2004-2009; the top-250 largest universities ranked by CPP (first 30 entries shown):

| University | CC | CPP | CPP/FCSm | P | MNCS2 |
|---|---|---|---|---|---|
| UNIV LAUSANNE | CH | 12.23 | 1.49 | 6950 | 1.55 |
| UNIV DUNDEE | UK | 11.82 | 1.42 | 4413 | 1.51 |
| LONDON SCH HYGIENE & TROPICAL MEDICINE UNIV LONDON | UK | 11.59 | 1.64 | 4232 | 1.62 |
| JG UNIV MAINZ | DE | 11.41 | 1.32 | 5015 | 1.25 |
| UNIV OXFORD | UK | 11.25 | 1.67 | 26539 | 1.63 |
| UNIV CAMBRIDGE | UK | 11.15 | 1.70 | 25662 | 1.63 |
| KAROLINSKA INST STOCKHOLM | SE | 11.09 | 1.36 | 16873 | 1.34 |
| UNIV BASEL | CH | 11.02 | 1.55 | 8483 | 1.46 |
| ERASMUS UNIV ROTTERDAM | NL | 11.01 | 1.49 | 12408 | 1.49 |
| UNIV GENEVE | CH | 10.90 | 1.45 | 9342 | 1.47 |
| UNIV COLL LONDON | UK | 10.85 | 1.48 | 26286 | 1.46 |
| IMPERIAL COLL LONDON | UK | 10.47 | 1.55 | 21967 | 1.49 |
| UNIV SUSSEX | UK | 10.33 | 1.60 | 3841 | 1.67 |
| UNIV EDINBURGH | UK | 10.25 | 1.54 | 13188 | 1.54 |
| UNIV ZURICH | CH | 10.16 | 1.46 | 13824 | 1.44 |
| LEIDEN UNIV | NL | 10.02 | 1.43 | 12513 | 1.37 |
| QUEEN MARY COLL UNIV LONDON | UK | 9.92 | 1.44 | 4586 | 1.45 |
| UNIV HEIDELBERG | DE | 9.71 | 1.35 | 15445 | 1.32 |
| STOCKHOLM UNIV | SE | 9.55 | 1.43 | 6427 | 1.50 |
| VRIJE UNIV AMSTERDAM | NL | 9.51 | 1.40 | 12201 | 1.40 |
| HEINRICH HEINE UNIV DUSSELDORF | DE | 9.50 | 1.29 | 6636 | 1.25 |
| UNIV DURHAM | UK | 9.44 | 1.69 | 5848 | 1.65 |
| ETH ZURICH | CH | 9.41 | 1.63 | 15099 | 1.64 |
| UNIV GLASGOW | UK | 9.34 | 1.41 | 10435 | 1.45 |
| KINGS COLL UNIV LONDON | UK | 9.33 | 1.38 | 13680 | 1.33 |
| UNIV SOUTHERN DENMARK | DK | 9.30 | 1.29 | 4786 | 1.34 |
| MED HOCHSCHULE HANNOVER | DE | 9.29 | 1.22 | 5233 | 1.16 |
| LMU UNIV MUNCHEN | DE | 9.23 | 1.38 | 16995 | 1.30 |
| UNIV AMSTERDAM | NL | 9.18 | 1.41 | 15492 | 1.36 |
| UNIV BORDEAUX II VICTOR SEGALEN | FR | 9.16 | 1.24 | 4354 | 1.22 |
Slide 38: Large, broad European university. Focus: top 25% in both publication output and citation impact. [Chart: impact ranking vs. publication ranking per field, from top 25% to bottom 25%.]
Slide 39: Smaller, specialized European university, specialized in economics and related fields (ECON, MATH, PSYCH): among the top 25% in citation impact, but in the lower 50% of publication output. [Chart: impact ranking vs. publication ranking per field, from top 25% to bottom 25%.]
Slide 40: Based on this updated and extended ranking approach: benchmarking studies of universities, comparing the 'target' university with 20 other universities of choice.
Slide 44: Current and recent benchmark projects: Manchester, Leiden, Heidelberg, Rotterdam, Copenhagen, Zürich, Lisbon UNL, Amsterdam UvA, Amsterdam VU, Southampton, Gent, Antwerp, Brussels VUB, UC London, Aarhus.
Slide 45: EU Multi-Ranking: Universiteit Leiden.
Slide 46: Leiden University.
Slide 47: Delft University of Technology.
Slide 48: METU Ankara.
Slide 49: Turkey 2001-2004.
Slide 50: Turkey 2005-2008.
Slide 51: Thank you for your attention. More information: www.cwts.leidenuniv.nl
Slide 54: External WoS coverage. Example: Uppsala 2002-2006.
Slide 55: More about rankings and benchmarking.
Slide 56: Ranking by top 10%.
Slide 59: Aarhus University. Size: university-specific.
Slide 60: Aalborg University.
Slide 63: Department 'Diseases of the Neurosystem': departmental output in fields and departmental impact from fields (both as CPP/FCSm); knowledge use by very high impact groups.
Slide 64: CPP/FCSm versus MNCS1 and MNCS2.
Slide 65: Figure 1: Relation between CPP/FCSm and MNCS1 for 158 research groups. Figure 2: Relation between CPP/FCSm and MNCS2 for 158 research groups.
Slide 66: Figure 3: Relation between CPP/FCSm and MNCS1 for the 365 largest universities in the world. Figure 4: Relation between CPP/FCSm and MNCS2 for the 365 largest universities in the world.
Slide 67: Figure 5: Relation between CPP/FCSm and MNCS1 for the 58 largest countries. Figure 6: Relation between CPP/FCSm and MNCS2 for the 58 largest countries.
Slide 68: Figure 7: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 8: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals.
Slide 69: Only journals with CPP/FCSm and MNCS1 < 2.5. Figure 9: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 10: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals.
Slide 70: Figure 11: Relation between CPP/FCSm and MNCS1 for 190 researchers (large UMC) with >20 WoS publications 1997-2006 (citations counted up to 2006). Figure 12: Relation between CPP/FCSm and MNCS2 for 190 researchers (large UMC) with >20 WoS publications 1997-2006 (citations counted up to 2008).
Slide 71: Performance assessment.
Slide 72: Application of Thomson-WoS impact factors for research performance evaluation is irresponsible: * Much too short citation window. * No field-specific normalization. * No distinction between document types. * Calculation errors/inconsistencies in numerator/denominator. * The underlying citation distribution is very skewed: the IF value is heavily determined by a few very highly cited papers.
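The skewness point is easy to reproduce: an impact factor is essentially a mean, and in a heavy-tailed citation distribution the mean sits far above the median. A small simulation on entirely synthetic citation counts (the lognormal parameters are arbitrary, not fitted to any journal):

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic citation counts for ~200 papers of one journal:
# a lognormal body with a heavy right tail, rounded to whole citations.
citations = np.round(rng.lognormal(mean=1.0, sigma=1.2, size=200))

mean, median = citations.mean(), np.median(citations)
top_share = np.sort(citations)[-10:].sum() / citations.sum()  # top 5% of papers
print(f"mean={mean:.1f}  median={median:.1f}  top-5% share={top_share:.0%}")
# The mean (the IF analogue) lands well above the median: a few very
# highly cited papers carry a large share of all citations.
```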
Slide 73: Basic performance indicators:
* P: output, the number of publications in WoS-covered journals
* C: absolute impact, the number of citations to these publications
* H: Hirsch index
* CPP: output-normalized impact, the average number of citations per publication of the institute/group
* JCSm: the average number of citations per publication of the journal set used by the institute/group
* FCSm: the average number of citations per publication of all journals of a specific field in which the institute/group is active
* p0: the percentage of not-cited publications
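Of these, only H requires more than a simple count or ratio. A standard sketch of the Hirsch index (the largest h such that h publications have at least h citations each):

```python
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1, 0]))  # -> 3 (the 4th paper has only 3 < 4 citations)
```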
Slide 74: CWTS key research performance indicators:
* JCSm/FCSm: relative impact of the journal set used
* CPP/JCSm: internationally journal-normalized impact
* CPP/FCSm: internationally field- and document-type-normalized impact
* MNCS: internationally field- and document-type-normalized impact, alternative normalization
* Pt/Πt: contribution to the top 5, 10, 20, ...%
* P*CPP/FCSm: size and impact together: brute force
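The practical difference between CPP/FCSm and MNCS is where the averaging happens: CPP/FCSm is a ratio of averages (the group's mean citation rate over the mean field reference value), while MNCS is an average of ratios (each paper normalized by its own field average first). A sketch with invented numbers:

```python
def cpp_fcsm(cites, field_avgs):
    # Ratio of averages: mean citations over mean expected citations.
    return (sum(cites) / len(cites)) / (sum(field_avgs) / len(field_avgs))

def mncs(cites, field_avgs):
    # Average of ratios: each paper normalized by its own field first.
    return sum(c / f for c, f in zip(cites, field_avgs)) / len(cites)

cites      = [10, 2, 0, 6]          # citations per paper (invented)
field_avgs = [8.0, 1.5, 4.0, 6.0]   # world average of each paper's field

print(cpp_fcsm(cites, field_avgs))  # 4.5 / 4.875 ≈ 0.92
print(mncs(cites, field_avgs))      # (1.25 + 1.33 + 0.00 + 1.00) / 4 ≈ 0.90
```

The two indicators usually agree closely (as Figures 1-12 above illustrate) but can diverge for small or field-heterogeneous publication sets.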
Slide 79: [Trend chart: CPP/FCSm (0.0-3.0) of the institute as a whole over 1996-2005, against the world average of 1.0.]
Slide 80: [The same trend chart, with the institute's groups divided over performance classes: A: 46%, half of which uncited; B: 10%; C: 15%; D: 10%; E: 19%.]
Slide 81: Publications from 1991, ..., 1995: time lag and citation window.
Slide 82: Performance classes by CPP/FCSm:
CPP/FCSm < 0.80: performance significantly below the international average (class A);
0.80 < CPP/FCSm < 1.20: performance about the international average (class B);
1.20 < CPP/FCSm < 2.00: performance significantly above the international average (class C);
2.00 < CPP/FCSm < 3.00: performance in international perspective is very good (class D);
CPP/FCSm > 3.00: performance in international perspective is excellent (class E).
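These thresholds translate directly into a lookup. A minimal sketch (the slide leaves the boundary values unassigned, so here each boundary is given to the higher class):

```python
def performance_class(cpp_fcsm: float) -> str:
    """Map a CPP/FCSm value to the five classes defined above."""
    if cpp_fcsm < 0.80:
        return "A"  # significantly below international average
    if cpp_fcsm < 1.20:
        return "B"  # about international average
    if cpp_fcsm < 2.00:
        return "C"  # significantly above international average
    if cpp_fcsm < 3.00:
        return "D"  # very good in international perspective
    return "E"      # excellent in international perspective

print(performance_class(1.43))  # -> "C"
```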
Slide 83: University, departments, fields.
Slide 84: The same institute, now broken down into its 5 departments: neuroscience, cancer research, genomics, clinical research, cardiovascular. [Chart: CPP/FCSm (0.0-3.0) vs. P per department, against the world average.]
Slide 85: Breakdown by fields: chemistry at the research group level, field-normalized (CPP/FCSm). NL national chemistry evaluation: 10 universities with major chemistry departments. One of these universities has 13 chemistry research groups:
* Bioinformatics
* Solid-state NMR
* Theoretical chemistry
* NMR studies of proteins
* Supramolecular chemistry
* Synthesis of antitumor compounds
* Synthetic organic chemistry
* Bio-organic chemistry
* Organometallic chemistry
* Chemometrics
* Crystal growth
* Autoimmune biochemistry
* Heat shock proteins
Slide 86: Cluster: a field as a set of publications with thematic/field-specific classification codes, again for new, emerging, often interdisciplinary fields; the fine-grained structure of science.
Slide 87: MeSH delineation vs. journal classification: the problem of the 'right' FCSm (ISI journal-category based vs. PubMed classification based).
Slide 91: Finding 1: Size-dependent cumulative advantage for the impact of universities in terms of the total number of citations. Quite remarkably, lower-performance universities have a larger size-dependent cumulative advantage for receiving citations than top-performance universities.
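A size-dependent cumulative advantage of this kind is typically tested by fitting a power law C ≈ a·P^b on log-log scales, where an exponent b > 1 means total citations grow faster than linearly with publication output. A sketch of such a test on synthetic data (the exponent 1.25 is an arbitrary choice for the simulation, not the study's estimate):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic universities: publication counts P and citation totals C,
# generated with an assumed exponent b = 1.25 plus lognormal noise.
P = rng.integers(1_000, 30_000, size=300)
C = 2.0 * P**1.25 * rng.lognormal(0.0, 0.2, size=300)

# Least-squares fit of log C = log a + b * log P.
b, log_a = np.polyfit(np.log(P), np.log(C), deg=1)
print(f"fitted exponent b = {b:.2f}")  # ~1.25; b > 1 => cumulative advantage
```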
Slide 94: Finding 2: For the lower-performance universities, the fraction of not-cited publications decreases with size. The higher the average journal impact of a university, the lower the number of not-cited publications. Also, the higher the average number of citations per publication in a university, the lower the number of not-cited publications. In other words, universities that are cited more per paper also have more cited papers.
Slide 96: Finding 3: The average research performance of a university, measured by the crown indicator CPP/FCSm, does not 'dilute' with increasing size. The large top-performance universities are characterized by 'big and beautiful': they succeed in keeping a high performance over a broad range of activities. This is an indication of their overall scientific and intellectual attractive power.
Slide 101: Finding 4: Universities with low field citation density and low journal impact have a size-dependent cumulative advantage for the total number of citations. For lower-performance universities, field citation density provides a cumulative advantage in citations per publication. Top universities publish in journals with higher journal impact than lower-performance universities do, and in journals with the same average impact the top universities perform a factor of 1.3 better than the bottom universities.
Slide 103: Finding 5: The fraction of self-citations decreases as a function of research performance, of average field citation density, and of average journal impact.
Slide 104: Some conclusions:
* Different research rankings by indicator are possible, but impact remains the most important.
* Different research rankings by field are possible, but strong universities are strong in most fields (cumulative advantage by reputation).
* Nevertheless, universities without a very high average impact may be very strong in one or a few fields: focus, ambitions.
Slide 105:
* Time-dependent analysis is important for monitoring strengths (and/or weaknesses...).
* Unexpected strengths may show up!
* International collaboration is very important in reinforcing a university's impact; this clearly underlines the importance of networks.
* There is a relatively large group of European universities with good overall quality and several centers of excellence: city and region give universities new opportunities (and vice versa).