Ranking of Universities: Methods, Limitations, Challenges
Middle East Technical University, Ankara, December 3, 2010
Anthony F.J. van Raan, Center for Science and Technology Studies (CWTS), Leiden University
Leiden University, founded in 1575, is the oldest university in the Netherlands and a member of the League of European Research Universities (LERU). Leiden is a historic city (2nd and 11th centuries) with a strong cultural (arts, painting) and scientific tradition, and hosts one of the largest science parks in the EU.
First and major challenge:
- Establish as well as possible the research performance in the recent past in quantifiable terms (objectivity, reliability);
- If you succeed in quantifying performance, you can always make a ranking.
Two methodological approaches:
(1) 'broader' peer review > expert survey > reputation assessment
(2) bibliometric indicators > performance and impact measurement
At the department or research-program level, (1) and (2) are strongly correlated.
Conceptual problems of peer review/surveys:
* Slow: 'old' reputation vs. performance at the research front now
* Mix of opinions about research, training, teaching, and political/social status
Conceptual problems of bibliometric analysis:
1. Evidence of research performance such as prizes and awards, earlier peer review, invitations as keynote speakers, etc., is not covered. However, in many cases these 'non-bibliometric' indicators correlate strongly with bibliometric indicators.
2. Focus on journal articles: differences across disciplines.
Cited publications receive their citations from many directions: from other disciplines, from emerging fields, from research devoted to societal, economic, and technological problems, from industry, and from international top groups. All of these are functions of time, f(t)! > Sleeping Beauties (see the sketch below).
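To make the time dependence concrete, here is a minimal Python sketch of a Sleeping-Beauty detector: a paper that stays almost uncited for years after publication and then attracts a burst of citations. The thresholds (`sleep_years`, `max_sleep_cites`, `awake_factor`) are illustrative assumptions, not the definition used by CWTS.

```python
def is_sleeping_beauty(citations_per_year, sleep_years=10,
                       max_sleep_cites=2, awake_factor=10):
    """Heuristic flag for a 'Sleeping Beauty': a paper that is almost
    uncited for sleep_years after publication and is then cited heavily.
    Thresholds are illustrative, not a published definition."""
    sleep = citations_per_year[:sleep_years]
    later = citations_per_year[sleep_years:]
    if not later:
        return False
    asleep = all(c <= max_sleep_cites for c in sleep)
    awakened = max(later) >= awake_factor * max(max(sleep), 1)
    return asleep and awakened

# Quiet for a decade, then a citation burst:
print(is_sleeping_beauty([0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 3, 12, 25, 40]))  # True
```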
[Figure: concept density map of Urban Studies, 2006-2008, n = 700 concepts. More stable, more comprehensible.]
[Figure: concept maps F(t1) and F(t2) of the same field at two points in time.] Knowledge dynamics can be described by a continuity equation, which expresses the change of a quantity inside any region in terms of its density and flow.
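Written out (the standard form of a continuity equation; the slide gives only the verbal description, so the notation here is an assumption):

```latex
% rho = local density of concepts/publications on the map, J = their flow
\frac{\partial \rho(\mathbf{r},t)}{\partial t} + \nabla \cdot \mathbf{J}(\mathbf{r},t) = 0
```

The density in a region then changes only through the net flow across its boundary, which is what makes the maps at t1 and t2 comparable.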
University > departments > fields.
'Bottom-up' analysis: input data (assignment of researchers to departments) necessary > detailed research performance analysis of a university by department.
'Top-down' analysis: a field structure is imposed on the university > broad overview analysis of a university by field.
Indicators divided into 5 categories, with fraction of final ranking score (Times Higher Education ranking):
Teaching: 0.30
Research: 0.30
Citations: 0.325
Industry income: 0.025
International: 0.05
Fractions are based on 'expert enthusiasm' and confidence in the data.
Indicators divided into 5 categories, with fraction of final ranking score (QS ranking):
Academic reputation: 0.40
Employer reputation: 0.10
Student/staff ratio: 0.20
Citations: 0.20
International: 0.10
Indicators divided into categories, with fraction of final ranking score (Shanghai ARWU ranking):
Nobel Prize alumni / awards: 0.10 + 0.20
HiCi staff, N&S, PUB: 0.20 + 0.20 + 0.20
PCP (size normalization): 0.10
Citations only via HiCi (but as a citation-per-staff measure)!
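All three rankings above share the same mechanics: pre-normalized indicator scores are combined into one composite by a fixed weight vector. A minimal Python sketch using the THE-style weights from the first of these slides; the indicator keys and the example scores are hypothetical:

```python
# Weights as quoted on the slide (they sum to 1.0); scores assumed
# pre-normalized to a common 0-100 scale.
weights = {"teaching": 0.30, "research": 0.30, "citations": 0.325,
           "industry_income": 0.025, "international": 0.05}

def composite_score(scores):
    """Weighted sum of pre-normalized indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

print(composite_score({"teaching": 70, "research": 80, "citations": 90,
                       "industry_income": 50, "international": 60}))  # 78.5
```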
Leiden Ranking, Spring and Autumn 2010:
500 largest universities worldwide
250 largest universities in Europe
Spring 2010: publications 2003-2008 (P), citations counted through 2009 (C)
Autumn 2010: publications and citations 2003-2009 (P and C)
Main features of the Leiden Ranking 2010:
* Best possible definition of universities by applying the CWTS address-unification algorithm
* New indicators next to the standard CWTS indicators, including university-industry collaboration (autumn)
* Field-specific rankings and top-indicators (15 major fields) > benchmarking
P = 2004-2008, C = 2004-2009; the 250 largest universities, ranked by CPP (first 30 entries shown):

University                                           Cty    CPP  CPP/FCSm      P  MNCS2
UNIV LAUSANNE                                        CH   12.23      1.49   6950   1.55
UNIV DUNDEE                                          UK   11.82      1.42   4413   1.51
LONDON SCH HYGIENE & TROPICAL MEDICINE UNIV LONDON   UK   11.59      1.64   4232   1.62
JG UNIV MAINZ                                        DE   11.41      1.32   5015   1.25
UNIV OXFORD                                          UK   11.25      1.67  26539   1.63
UNIV CAMBRIDGE                                       UK   11.15      1.70  25662   1.63
KAROLINSKA INST STOCKHOLM                            SE   11.09      1.36  16873   1.34
UNIV BASEL                                           CH   11.02      1.55   8483   1.46
ERASMUS UNIV ROTTERDAM                               NL   11.01      1.49  12408   1.49
UNIV GENEVE                                          CH   10.90      1.45   9342   1.47
UNIV COLL LONDON                                     UK   10.85      1.48  26286   1.46
IMPERIAL COLL LONDON                                 UK   10.47      1.55  21967   1.49
UNIV SUSSEX                                          UK   10.33      1.60   3841   1.67
UNIV EDINBURGH                                       UK   10.25      1.54  13188   1.54
UNIV ZURICH                                          CH   10.16      1.46  13824   1.44
LEIDEN UNIV                                          NL   10.02      1.43  12513   1.37
QUEEN MARY COLL UNIV LONDON                          UK    9.92      1.44   4586   1.45
UNIV HEIDELBERG                                      DE    9.71      1.35  15445   1.32
STOCKHOLM UNIV                                       SE    9.55      1.43   6427   1.50
VRIJE UNIV AMSTERDAM                                 NL    9.51      1.40  12201   1.40
HEINRICH HEINE UNIV DUSSELDORF                       DE    9.50      1.29   6636   1.25
UNIV DURHAM                                          UK    9.44      1.69   5848   1.65
ETH ZURICH                                           CH    9.41      1.63  15099   1.64
UNIV GLASGOW                                         UK    9.34      1.41  10435   1.45
KINGS COLL UNIV LONDON                               UK    9.33      1.38  13680   1.33
UNIV SOUTHERN DENMARK                                DK    9.30      1.29   4786   1.34
MED HOCHSCHULE HANNOVER                              DE    9.29      1.22   5233   1.16
LMU UNIV MUNCHEN                                     DE    9.23      1.38  16995   1.30
UNIV AMSTERDAM                                       NL    9.18      1.41  15492   1.36
UNIV BORDEAUX II VICTOR SEGALEN                      FR    9.16      1.24   4354   1.22
Large, broad European university. Focus: top 25% in both publication output and citation impact.
[Figure: quadrant chart of publication ranking vs. impact ranking, with top-25% and bottom-25% bands; this university lies in the top-25% quadrant on both axes.]
Smaller specialized European university, specialized in economics and related fields. Among the top 25% in citation impact, but in the lower 50% of publication output.
[Figure: quadrant chart of publication ranking vs. impact ranking, with top-25% and bottom-25% bands; strongest fields: ECON, MATH, PSYCH.]
Based on this updated and extended ranking approach: benchmarking studies of universities, comparing the 'target' university with 20 other universities of choice.
Application of Thomson-WoS Impact Factors for research performance evaluation is irresponsible:
* Much too short a citation window
* No field-specific normalization
* No distinction between document types
* Calculation errors/inconsistencies in numerator/denominator
* The underlying citation distribution is very skewed: the IF value is heavily determined by a few very highly cited papers (see the toy illustration below)
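The skewness point is easy to demonstrate with toy numbers: because the Impact Factor is a mean, a single highly cited paper can dominate it, while the typical paper looks very different. The citation counts below are invented for illustration only:

```python
# Illustrative citation counts for one journal's papers:
# most papers are rarely cited, one is cited very heavily.
citations = [0, 0, 0, 1, 1, 1, 2, 2, 3, 90]

mean = sum(citations) / len(citations)            # what an IF-style average reports
median = sorted(citations)[len(citations) // 2]   # what the typical paper gets

print(mean, median)  # 10.0 vs 1: the mean is driven by one paper
```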
Basic Performance Indicators:
* P: number of publications in WoS-covered journals (output)
* C: number of citations to these publications (absolute impact)
* H: Hirsch index
* CPP: average number of citations per publication of the institute/group (output-normalized impact)
* JCSm: average number of citations per publication of the journal set used by the institute/group
* FCSm: average number of citations per publication of all journals of a specific field in which the institute/group is active
* p0: percentage of uncited publications
CWTS Key Research Performance Indicators:
* JCSm/FCSm: relative impact of the journal set used
* CPP/JCSm: internationally journal-normalized impact
* CPP/FCSm: internationally field- and document-type-normalized impact
* MNCS: internationally field- and document-type-normalized impact, with an alternative normalization (see the sketch below)
* Pt/Πt: contribution to the top 5, 10, 20, ...%
* P*CPP/FCSm: size and impact together: 'brute force'
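The difference between CPP/FCSm and MNCS is where the averaging happens: CPP/FCSm is a ratio of sums (equivalently, of means), while MNCS is a mean of per-paper ratios. A minimal sketch, assuming each paper has a citation count c_i and a field/document-type expected value e_i; all numbers are illustrative:

```python
def cpp_fcsm(cites, expected):
    """'Old crown' indicator: ratio of sums (total citations over
    total field-expected citations)."""
    return sum(cites) / sum(expected)

def mncs(cites, expected):
    """MNCS: mean of per-paper ratios c_i / e_i."""
    return sum(c / e for c, e in zip(cites, expected)) / len(cites)

cites = [10, 0, 4]
expected = [5.0, 2.0, 4.0]        # field/document-type baselines (illustrative)
print(cpp_fcsm(cites, expected))  # 14/11 ~ 1.27
print(mncs(cites, expected))      # (2 + 0 + 1)/3 = 1.0
```

The same data thus yields different values under the two normalizations, which is why CWTS reports both.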
[Figure: trend in CPP/FCSm (scale 0.0-3.0) for the institute as a whole, 1996-2005, plotted against the world average of 1.0.]
[Figure: the same CPP/FCSm trend chart (scale 0.0-3.0, 1996-2005, world average 1.0), now with the institute broken down into performance classes: A: 46% (half of which with 0 citations), B: 10%, C: 15%, D: 10%, E: 19%.]
Publications from 1991, ..., 1995: time lag & citation window.
Performance classes (a direct coding of these thresholds follows below):
CPP/FCSm < 0.80: performance significantly below international average, class A;
0.80 < CPP/FCSm < 1.20: performance about international average, class B;
1.20 < CPP/FCSm < 2.00: performance significantly above international average, class C;
2.00 < CPP/FCSm < 3.00: performance in international perspective is very good, class D;
CPP/FCSm > 3.00: performance in international perspective is excellent, class E.
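A straightforward transcription of the slide's classification into Python:

```python
def performance_class(ratio):
    """Map a CPP/FCSm value to the A-E classes defined on the slide."""
    if ratio < 0.80:
        return "A (significantly below international average)"
    if ratio < 1.20:
        return "B (about international average)"
    if ratio < 2.00:
        return "C (significantly above international average)"
    if ratio < 3.00:
        return "D (very good in international perspective)"
    return "E (excellent in international perspective)"

# E.g. Leiden's CPP/FCSm of 1.43 from the table above falls in class C:
print(performance_class(1.43))
```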
[Figure: the same institute, now broken down into its five departments (neuroscience, cancer research, genomics, clinical research, cardiovascular): CPP/FCSm (scale 0.0-3.0) vs. P, against the world average of 1.0.]
NL national chemistry evaluation: 10 universities with major chemistry departments. One of these universities has 13 chemistry research groups:
* Bioinformatics
* Solid-state NMR
* Theoretical chemistry
* NMR studies of proteins
* Supramolecular chemistry
* Synthesis of antitumor compounds
* Synthetic organic chemistry
* Bio-organic chemistry
* Organometallic chemistry
* Chemometrics
* Crystal growth
* Autoimmune biochemistry
* Heat shock proteins
[Figure: breakdown by field, chemistry at research-group level, field-normalized impact CPP/FCSm per group.]
Field = a cluster, i.e. a set of publications with thematic/field-specific classification codes; this again works for new, emerging, often interdisciplinary fields > a fine-grained structure of science.
MeSH delineation vs. journal classification: the problem of the 'right' FCSm (a toy sketch follows below).
[Figure: two FCSm values for the same publication set, one based on ISI journal categories, one based on a PubMed/MeSH classification.]
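Why the delineation matters: the baseline FCSm is an average over the fields a group is assigned to, so regrouping the same publications under a different classification shifts the baseline and hence the normalized impact. All field names and numbers below are hypothetical:

```python
# Simplified: FCSm as the mean expected citation rate of the fields a
# group publishes in; it shifts with the classification system used.
journal_based = {"Clinical Neurology": 6.0, "Neurosciences": 9.0}  # ISI-style categories
mesh_based = {"Stroke": 7.5, "Neuroimaging": 11.0}                 # PubMed/MeSH-style fields

papers, total_cites = 100, 800  # same publication set in both cases

for name, fields in [("ISI journal categories", journal_based),
                     ("PubMed/MeSH delineation", mesh_based)]:
    fcsm = sum(fields.values()) / len(fields)
    print(f"{name}: FCSm = {fcsm:.2f}, CPP/FCSm = {total_cites / papers / fcsm:.2f}")
    # prints 1.07 for the journal-based baseline, 0.86 for the MeSH-based one
```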
Finding 1: There is a size-dependent cumulative advantage for the impact of universities in terms of the total number of citations. Quite remarkably, lower-performance universities have a larger size-dependent cumulative advantage for receiving citations than top-performance universities.
Finding 2: For the lower-performance universities, the fraction of uncited publications decreases with size. The higher the average journal impact of a university, the lower its number of uncited publications; likewise, the higher the average number of citations per publication, the lower the number of uncited publications. In other words, universities that are cited more per paper also have more cited papers.
Finding 3: The average research performance of a university, measured by the crown indicator CPP/FCSm, does not 'dilute' with increasing size. The large top-performance universities are 'big and beautiful': they succeed in keeping a high performance over a broad range of activities. This indicates their overall scientific and intellectual attractive power.
Finding 4: Universities with low field citation density and low journal impact have a size-dependent cumulative advantage for the total number of citations. For lower-performance universities, field citation density provides a cumulative advantage in citations per publication. Top universities publish in journals with higher journal impact than lower-performance universities, and they perform a factor of 1.3 better than bottom universities in journals with the same average impact.
Finding 5: The fraction of self-citations decreases as a function of research performance, of average field citation density, and of average journal impact.
Some conclusions:
* Different research rankings are possible depending on the indicator, but impact remains the most important
* Different research rankings are possible by field, but strong universities are strong in most fields (cumulative advantage by reputation)
* Nevertheless, universities with a less high average impact may be very strong in one or a few fields: focus, ambitions
* Time-dependent analysis is important for monitoring strengths (and/or weaknesses...)
* Unexpected strengths may show up!
* International collaboration is very important in reinforcing a university's impact; this clearly underlines the importance of networks
* There is a relatively large group of European universities with good overall quality and several centers of excellence: city and region give universities new opportunities (and vice versa)