LEIDEN UNIVERSITY 2010: Presentation Transcript

    • Ranking of Universities: Methods, Limitations, Challenges. Middle East Technical University, Ankara, December 3, 2010. Anthony F.J. van Raan, Center for Science and Technology Studies (CWTS), Leiden University
    • Leiden University, oldest in the Netherlands (1575), member of the League of European Research Universities (LERU). Leiden, historic city (2nd, 11th C.), with a strong cultural (arts, painting) and scientific tradition; one of the largest science parks in the EU
    • First and major challenge: establish, as well as possible, the research performance in the recent past in quantifiable terms (objectivity, reliability); if you succeed in quantifying performance, you can always make a ranking
    • Two methodological approaches: (1) 'broader' peer review > expert survey > reputation assessment; (2) bibliometric indicators > performance and impact measurement. At the department or research program level, (1) and (2) are strongly correlated
    • Conceptual problems of peer review/surveys: * Slow: 'old' reputation vs. performance at the research front now; * Mix of opinions about research, training, teaching, political/social status
    • Conceptual problems of bibliometric analysis: 1. Evidence of research performance such as prizes and awards, earlier peer review, invitations as keynote speakers, etc. is not covered. However, in many cases these 'non-bibliometric' indicators correlate strongly with bibliometric indicators. 2. Focus on journal articles: differences across disciplines
    • Citing publications of the cited publications come from other disciplines, from emerging fields, from research devoted to societal, economic and technological problems, from industry, and from international top groups. These are all f(t)!! > Sleeping Beauties
    • WoS/Scopus sub-universe: journal articles only, > 1,000,000 p/y. Total publication universe: natural/medical sciences >>> applied, social sciences, humanities
    • WoS/Scopus sub-universe: journal articles only, > 1,000,000 p/y; references > non-WoS. Total publication universe includes non-WoS publications: books, book chapters, conference proceedings, reports, non-WoS journals
    • Cited publications of unit A: P(A), with citing publications C(A); field reference set: P(f), with citations C(f). CPP = C(A)/P(A); FCS = C(f)/P(f). Field-specific normalization: CPP/FCSm = [C(A)/P(A)] / [C(f)/P(f)], with document-type normalization and with self-citations excluded, also from C(f)!
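As a rough illustration of the normalization above, here is a minimal Python sketch on hypothetical publication data and field baselines (not CWTS data). The real CWTS calculation additionally matches document types and removes self-citations on both sides, which is not reproduced here.

```python
# Minimal sketch of the CPP/FCSm ("crown") indicator on hypothetical data.
# Each publication of unit A: (citations received, field it belongs to).
pubs_A = [
    (12, "neuroscience"),
    (0,  "neuroscience"),
    (5,  "oncology"),
    (30, "oncology"),
]

# Hypothetical field baselines: average citations per publication in the
# same field (and, in practice, document type and year), i.e. C(f)/P(f).
field_baseline = {"neuroscience": 8.0, "oncology": 10.0}

# CPP = C(A)/P(A): average citations per publication of the unit.
cpp = sum(c for c, _ in pubs_A) / len(pubs_A)

# FCSm: mean field citation score of the fields the unit publishes in,
# weighted by the unit's publications per field.
fcsm = sum(field_baseline[f] for _, f in pubs_A) / len(pubs_A)

print("CPP      =", cpp)          # 11.75
print("FCSm     =", fcsm)         # 9.0
print("CPP/FCSm =", cpp / fcsm)   # ~1.31
```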
    • Bibliographic coupling of journals: Ja1, Ja2, Ja3, Ja4 and Jb1, Jb2, Jb3, Jb4, Jb5
    • Economics journals 2008: journal map, n=332, colored by JCS/FCS; see http://www.neesjanvaneck.nl/journalmap
    • Economics journals 2008: journal density map, n=332
    • Bibliographic coupling: Fa1, Fa2, Fa3, Fa4 and Fb1, Fb2, Fb3, Fb4, Fb5
    • Enhance visibility of engineering, social sciences and humanities by proper publication-density normalization. World map: science as a structure of 200 related fields (size: world average)
    • Urban Studies 2006-2008: concept map, n=700
    • Urban Studies 2006-2008: concept density map, n=700. More stable, more comprehensible
    • Knowledge dynamics F(t1) > F(t2): described by a continuity equation that expresses the change of a quantity inside any region in terms of density and flow?
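For reference, the generic continuity equation alluded to here balances the local change of a density against the divergence of its flow; its application to knowledge dynamics is the slide's open question, so this is only the standard form:

```latex
% Differential form: local change of the density rho is balanced by the divergence of the flow j.
\[
  \frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{j} = 0
\]
% Integral form: the change of the quantity inside a fixed region V equals the net inflow through its boundary.
\[
  \frac{d}{dt}\int_{V} \rho \, dV = -\oint_{\partial V} \mathbf{j} \cdot d\mathbf{S}
\]
```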
    • University > departments > fields. 'Bottom-up' analysis: input data (assignment of researchers to departments) necessary > detailed research performance analysis of a university by department. 'Top-down' analysis: the field structure is imposed on the university > broad overview analysis of a university by field
    • Indicators divided into 5 categories, with fraction of final ranking score: Teaching: 0.30; Research: 0.30; Citations: 0.325; Industry income: 0.025; International: 0.05. Fractions are based on 'expert enthusiasm' and confidence in the data
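To make the weighting scheme concrete, a minimal sketch (with hypothetical category scores, not THE data) of how such fractions combine into a final ranking score:

```python
# Hypothetical normalized category scores (0-100) for one university.
scores = {"teaching": 62.0, "research": 71.5, "citations": 88.0,
          "industry_income": 40.0, "international": 55.0}

# Category fractions as listed on the slide (they sum to 1.0).
weights = {"teaching": 0.30, "research": 0.30, "citations": 0.325,
           "industry_income": 0.025, "international": 0.05}

# Final score is the weighted sum of the category scores.
final_score = sum(weights[k] * scores[k] for k in weights)
print(round(final_score, 2))  # 72.4
```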
    • Indicators divided into 5 categories, with fraction of final ranking score: Academic Reputation: 0.40; Employer Reputation: 0.10; Student/staff: 0.20; Citations: 0.20; International: 0.10
    • Indicators divided into 5 categories, with fraction of final ranking score: Nobel Prize Alumni/Awards: 0.10 + 0.20; HiCi staff, N&S, PUB: 0.20 + 0.20 + 0.20; PCP (size normalization): 0.10. Citations only in HiCi (but as a citations-per-staff measure)!
    • Leiden Ranking Spring and Autumn 2010: 500 largest universities worldwide, 250 largest universities in Europe. Spring 2010: 2003-2008 (P) / 2009 (C). Autumn 2010: 2003-2009 (P and C)
    • Main features of Leiden Ranking 2010: * Best possible definition of universities by applying the CWTS address unification algorithm; * New indicators next to the standard CWTS indicators, including university-industry collaboration (autumn); * Field-specific ranking and top indicators (15 major fields) > Benchmarking
    • P = 2004-2008, C = 2004-2009; top-250 largest universities, ranked by CPP:

| University | Country | CPP | MNCS2 | P | CPP/FCSm |
|---|---|---|---|---|---|
| UNIV LAUSANNE | CH | 12.23 | 1.49 | 6950 | 1.55 |
| UNIV DUNDEE | UK | 11.82 | 1.42 | 4413 | 1.51 |
| LONDON SCH HYGIENE & TROPICAL MEDICINE UNIV LONDON | UK | 11.59 | 1.64 | 4232 | 1.62 |
| JG UNIV MAINZ | DE | 11.41 | 1.32 | 5015 | 1.25 |
| UNIV OXFORD | UK | 11.25 | 1.67 | 26539 | 1.63 |
| UNIV CAMBRIDGE | UK | 11.15 | 1.70 | 25662 | 1.63 |
| KAROLINSKA INST STOCKHOLM | SE | 11.09 | 1.36 | 16873 | 1.34 |
| UNIV BASEL | CH | 11.02 | 1.55 | 8483 | 1.46 |
| ERASMUS UNIV ROTTERDAM | NL | 11.01 | 1.49 | 12408 | 1.49 |
| UNIV GENEVE | CH | 10.90 | 1.45 | 9342 | 1.47 |
| UNIV COLL LONDON | UK | 10.85 | 1.48 | 26286 | 1.46 |
| IMPERIAL COLL LONDON | UK | 10.47 | 1.55 | 21967 | 1.49 |
| UNIV SUSSEX | UK | 10.33 | 1.60 | 3841 | 1.67 |
| UNIV EDINBURGH | UK | 10.25 | 1.54 | 13188 | 1.54 |
| UNIV ZURICH | CH | 10.16 | 1.46 | 13824 | 1.44 |
| LEIDEN UNIV | NL | 10.02 | 1.43 | 12513 | 1.37 |
| QUEEN MARY COLL UNIV LONDON | UK | 9.92 | 1.44 | 4586 | 1.45 |
| UNIV HEIDELBERG | DE | 9.71 | 1.35 | 15445 | 1.32 |
| STOCKHOLM UNIV | SE | 9.55 | 1.43 | 6427 | 1.50 |
| VRIJE UNIV AMSTERDAM | NL | 9.51 | 1.40 | 12201 | 1.40 |
| HEINRICH HEINE UNIV DUSSELDORF | DE | 9.50 | 1.29 | 6636 | 1.25 |
| UNIV DURHAM | UK | 9.44 | 1.69 | 5848 | 1.65 |
| ETH ZURICH | CH | 9.41 | 1.63 | 15099 | 1.64 |
| UNIV GLASGOW | UK | 9.34 | 1.41 | 10435 | 1.45 |
| KINGS COLL UNIV LONDON | UK | 9.33 | 1.38 | 13680 | 1.33 |
| UNIV SOUTHERN DENMARK | DK | 9.30 | 1.29 | 4786 | 1.34 |
| MED HOCHSCHULE HANNOVER | DE | 9.29 | 1.22 | 5233 | 1.16 |
| LMU UNIV MUNCHEN | DE | 9.23 | 1.38 | 16995 | 1.30 |
| UNIV AMSTERDAM | NL | 9.18 | 1.41 | 15492 | 1.36 |
| UNIV BORDEAUX II VICTOR SEGALEN | FR | 9.16 | 1.24 | 4354 | 1.22 |
    • Large, broad European university (impact ranking vs. publication ranking, top 25% to bottom 25%): in the top 25% of both publication output and citation impact
    • Smaller specialized European university (impact ranking vs. publication ranking, top 25% to bottom 25%): specialized in economics and related fields (ECON, MATH, PSYCH); among the top 25% in citation impact, but in the lower 50% of publication output
    • Based on this updated and extended ranking approach: benchmarking studies of universities. Comparison of the 'target' university with 20 other universities of choice:
    • Current and recent benchmark projects: Manchester, Leiden, Heidelberg, Rotterdam, Copenhagen, Zürich, Lisbon UNL, Amsterdam UvA, Amsterdam VU, Southampton, Gent, Antwerp, Brussels VUB, UC London, Aarhus
    • EU Multi-Ranking Universiteit Leiden
    • Leiden University
    • Delft University of Technology
    • METU Ankara
    • Turkey 2001-2004
    • Turkey 2005-2008
    • Thank you for your attention. More information: www.cwts.leidenuniv.nl
    • External WoS coverage. Example: Uppsala 2002-2006
    • More about rankings and benchmarking
    • Ranking by Top-10%
    • Aarhus University (size: university specific)
    • Aalborg University
    • Diseases of the neurosystem: department output in fields and department impact from fields (both CPP/FCSm); knowledge use by very high impact groups
    • CPP/FCSm versus MNCS1 and MNCS2
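The contrast explored in the following figures is a ratio of averages (CPP/FCSm) versus an average of per-publication ratios (MNCS). Below is a minimal sketch on hypothetical data; the MNCS1/MNCS2 variants, which differ in how recent publications are handled, are not distinguished here.

```python
# Each tuple: (citations received, expected citations for its field/document type/year).
pubs = [(12, 8.0), (0, 8.0), (5, 10.0), (30, 10.0)]

# CPP/FCSm: ratio of averages (the "old" crown indicator).
cpp = sum(c for c, _ in pubs) / len(pubs)
fcsm = sum(e for _, e in pubs) / len(pubs)
cpp_fcsm = cpp / fcsm

# MNCS: average of per-publication normalized citation scores (the "new" crown indicator).
mncs = sum(c / e for c, e in pubs) / len(pubs)

print(round(cpp_fcsm, 2))  # 1.31
print(round(mncs, 2))      # 1.25
```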
    • Figure 1: Relation between CPP/FCSm and MNCS1 for 158 research groups. Figure 2: Relation between CPP/FCSm and MNCS2 for 158 research groups
    • Figure 3: Relation between CPP/FCSm and MNCS1 for the 365 largest universities in the world. Figure 4: Relation between CPP/FCSm and MNCS2 for the 365 largest universities in the world
    • Figure 5: Relation between CPP/FCSm and MNCS1 for the 58 largest countries. Figure 6: Relation between CPP/FCSm and MNCS2 for the 58 largest countries
    • Figure 7: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 8: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals
    • Only journals with CPP/FCSm and MNCS1 < 2.5. Figure 9: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 10: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals
    • Figure 11: Relation between CPP/FCSm and MNCS1 for 190 researchers (large UMC) with > 20 WoS publications 1997-2006 (citations counted up to 2006). Figure 12: Relation between CPP/FCSm and MNCS2 for 190 researchers (large UMC) with > 20 WoS publications 1997-2006 (citations counted up to 2008)
    • Performance assessment
    • Application of Thomson-WoS Impact Factors for research performance evaluation is irresponsible: * Much too short citation window; * No field-specific normalization; * No distinction between document types; * Calculation errors/inconsistencies in numerator/denominator; * The underlying citation distribution is very skewed: the IF value is heavily determined by a few very highly cited papers
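To illustrate the last point about skewness, here is a small sketch with a hypothetical citation distribution for one journal: the mean (which is essentially what an impact factor reports) lies far above what a typical paper receives.

```python
# Hypothetical citation counts for a journal's papers within the IF citation window:
# most papers receive few citations, a handful receive very many.
citations = [0, 0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 120, 250]

mean = sum(citations) / len(citations)            # IF-like average, pulled up by two outliers
median = sorted(citations)[len(citations) // 2]   # a typical paper

print(round(mean, 1))  # 25.4
print(median)          # 3
```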
      • Basic Performance Indicators
      • P (output): number of publications in WoS-covered journals
      • C (absolute impact): number of citations to these publications
      • H: Hirsch index
      • CPP (output-normalized impact): average number of citations per publication of the institute/group
      • JCSm: average number of citations per publication of the journal set used by the institute/group
      • FCSm: average number of citations per publication of all journals of a specific field in which the institute/group is active
      • p0: percentage of not-cited publications
      • CWTS Key Research Performance Indicators:
      • JCSm/FCSm Relative impact of the used journal set
      • CPP/JCSm Internat. journal-normalized impact
      • CPP/FCSm Internat. field & doc-normalized impact
      • MNCS Internat. field & doc-normalized impact (alternative normalization)
      • Pt/Πt Contribution to the top-5, 10, 20, ...%
      • P*CPP/FCSm Size & impact together: 'brute force' (a computational sketch of several of these indicators follows this list)
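A minimal sketch of several of the indicators above, on hypothetical citation counts: the Hirsch index H, the percentage of uncited publications p0, and the contribution to a top-percentile class. The top-10% citation threshold used here is an assumed value; in practice it comes from the field-specific citation distribution.

```python
# Hypothetical citation counts for a group's publications.
citations = [45, 30, 22, 17, 12, 9, 6, 4, 2, 1, 0, 0]

# H (Hirsch index): the largest h such that h publications have at least h citations each.
ranked = sorted(citations, reverse=True)
h_index = max((i + 1 for i, c in enumerate(ranked) if c >= i + 1), default=0)

# p0: percentage of not-cited publications.
p0 = 100 * sum(1 for c in citations if c == 0) / len(citations)

# Contribution to the top-10%: the threshold of 20 citations is assumed for illustration.
TOP10_THRESHOLD = 20
top10_share = 100 * sum(1 for c in citations if c >= TOP10_THRESHOLD) / len(citations)

print(h_index)              # 6
print(round(p0))            # 17
print(round(top10_share))   # 25
```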
    • CPP/FCSm trend over time (1996-2005) for the institute as a whole, against the world average
    • CPP/FCSm trend over time (1996-2005) for the institute as a whole, against the world average, with publications divided over classes: A: 46% (half of which 0 citations), B: 10%, C: 15%, D: 10%, E: 19%
    • Publications from 1991, ..., 1995: time lag and citation window
    • CPP/FCSm < 0.80: performance significantly below internat. average, class A; 0.80 < CPP/FCSm < 1.20: performance about internat. average, class B; 1.20 < CPP/FCSm < 2.00: performance significantly above internat. average, class C; 2.00 < CPP/FCSm < 3.00: performance in internat. perspective is very good, class D; CPP/FCSm > 3.00: performance in internat. perspective is excellent, class E.
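The class boundaries above translate directly into a small lookup; a sketch:

```python
def performance_class(cpp_fcsm: float) -> str:
    """Map a CPP/FCSm value to the performance classes defined on the slide."""
    if cpp_fcsm < 0.80:
        return "A: significantly below international average"
    if cpp_fcsm < 1.20:
        return "B: about international average"
    if cpp_fcsm < 2.00:
        return "C: significantly above international average"
    if cpp_fcsm < 3.00:
        return "D: very good in international perspective"
    return "E: excellent in international perspective"

print(performance_class(1.31))  # C: significantly above international average
```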
    • University departments fields
    • Same institute, now broken down into its 5 departments (neuroscience, cancer research, genomics, clinical research, cardiovascular): CPP/FCSm against P, with the world average shown
      • NL national chemistry evaluation: 10 Universities with major chemistry departments
      • One of these universities has 13 chemistry research groups:
      • Bioinformatics
      • Solid state NMR
      • Theoretical chemistry
      • NMR studies of proteins
      • Supramolecular chemistry
      • Synthesis of antitumor compounds
      • Synthetic organic chemistry
      • Bio-organic chemistry
      • Organometallic chemistry
      • Chemometrics
      • Crystal growth
      • Autoimmune biochemistry
      • Heat shock proteins
      Breakdown by fields: chemistry at the research group level, field-normalized impact (CPP/FCSm)
    • Field = set of publications with thematic/field-specific classification codes; clustering again reveals new, emerging, often interdisciplinary fields and the fine-grained structure of science
    • MeSH delineation vs. journal classification: the problem of the 'right' FCSm (ISI journal-category based vs. PubMed classification based)
    • Finding 1: Size-dependent cumulative advantage for the impact of universities in terms of total number of citations. Quite remarkably, lower performance universities have a larger size-dependent cumulative advantage for receiving citations than top-performance universities.
    • Finding 2: For the lower-performance universities the fraction of not-cited publications decreases with size. The higher the average journal impact of a university, the lower the number of not-cited publications. Also, the higher the average number of citations per publication in a university, the lower the number of not-cited publications. In other words, universities that are cited more per paper also have more cited papers.
    • Finding 3: The average research performance of a university, measured by the crown indicator CPP/FCSm, does not 'dilute' with increasing size. The large top-performance universities are 'big and beautiful': they succeed in keeping a high performance over a broad range of activities. This is an indication of their overall scientific and intellectual attractive power.
    • Finding 4: Low field citation density and low journal impact universities have a size-dependent cumulative advantage for the total number of citations. For lower-performance universities, field citation density provides a cumulative advantage in citations per publication. Top universities publish in journals with higher journal impact than lower-performance universities, and they perform a factor of 1.3 better than bottom universities in journals with the same average impact.
    • Finding 5: The fraction of self-citations decreases as a function of research performance, of average field citation density, and of average journal impact.
      • Some conclusions:
      • Different research rankings by indicator possible, but impact remains the most important
      • Different research rankings by field possible, but strong universities are strong in most of the fields (cumulative advantage by reputation)
      • Nevertheless, universities with a not particularly high average impact may be very strong in one or a few fields: focus, ambitions
      • Time-dependent analysis is important for monitoring strengths (and/or weaknesses…)
      • Unexpected strengths may show up!
      • International collaboration is very important in reinforcing a university's impact; this clearly underlines the importance of networks
      • A relatively large group of European universities with good overall quality and several centers of excellence: city and region give universities new opportunities (and vice versa)