Responsible use of university rankings
Ludo Waltman
Centre for Science and Technology Studies, Leiden University
10th Annual International Symposium on University Rankings and Quality Assurance 2018
Brussels, June 21, 2018
CWTS Leiden Ranking
• Focused on research, not on teaching
• Based purely on bibliometric indicators; no survey data or data provided by universities
• High-quality bibliometric methodology
• No composite indicators
• Multiple views, not just a simple list
CWTS Leiden Ranking (2018 edition)
• All universities worldwide with ≥1000 Web of Science
publications in period 2013–2016
• 938 universities from 55 countries
Source: www.cwts.nl/blog?article=n-r2q274
Source: www.researchresearch.com/news/article/?articleId=1368350
Design of rankings
1. A generic concept of university performance should not
be used
2. A clear distinction should be made between size-dependent and size-independent indicators
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
Do not use a generic concept of
university performance
Composite indicator
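The problem with composite indicators can be made concrete in a few lines. The universities, sub-indicators, and weights below are purely hypothetical; the point is only that the resulting ranking is an artifact of the weighting choice, not a property of the universities.

```python
# Illustrative sketch: a composite score depends on arbitrary weights.
# Universities, dimensions, and scores below are hypothetical.

def composite(scores, weights):
    """Weighted sum of sub-indicator scores (assumed already normalized 0-100)."""
    return sum(s * w for s, w in zip(scores, weights))

# Two hypothetical universities, scored on (research, teaching).
uni_a = (90, 40)   # strong research, weaker teaching
uni_b = (60, 80)   # balanced profile

# A research-heavy weighting puts A first...
assert composite(uni_a, (0.7, 0.3)) > composite(uni_b, (0.7, 0.3))
# ...an equal weighting puts B first.
assert composite(uni_a, (0.5, 0.5)) < composite(uni_b, (0.5, 0.5))
```

Neither ordering is "wrong"; the single number simply hides a choice about what university performance means.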
Distinguish between size-dependent and
size-independent indicators
What are the wealthiest countries in the world?
Total GDP (size-dependent) vs. GDP per capita (size-independent)
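The GDP analogy can be sketched directly. The figures below are rough, publicly known orders of magnitude (circa 2018), used only to show how the two perspectives reorder the same countries.

```python
# Same data, two rankings: size-dependent (GDP) vs size-independent (GDP per capita).
# Figures are rough c. 2018 values, for illustration only.
countries = {
    # name: (GDP in billions of USD, population in millions)
    "United States": (20500, 327),
    "China": (13600, 1393),
    "Luxembourg": (70, 0.6),
}

by_gdp = sorted(countries, key=lambda c: countries[c][0], reverse=True)
by_gdp_per_capita = sorted(
    countries, key=lambda c: countries[c][0] / countries[c][1], reverse=True
)

print("By GDP:           ", by_gdp)
print("By GDP per capita:", by_gdp_per_capita)
```

By total GDP the large economies dominate; by GDP per capita a small rich country comes out on top. The same distinction applies to counting a university's highly cited publications versus its proportion of highly cited publications.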
Define universities consistently
• leiden univ
• leiden state univ
• state univ leiden
• leiden univ hosp
• state univ leiden hosp
• univ leiden hosp
• univ hosp leiden
• lumc
• univ leiden
• leiden univ med ctr
• leiden state univ hosp
• leiden observ
• sterrewacht leiden
• acad hosp leiden
• rijksuniv leiden
• rijksherbarium
• gorlaeus labs
• leiden inst brain & cognit
• leiden inst chem
• sylvius labs
• acad ziekenhuis leiden
• leiden cytol & pathol lab
• rijksherbarium hortus bot
• ...
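All of the name variants above refer to Leiden University and its affiliated institutes. A toy sketch of the unification step (the mapping below is hand-made for illustration and is not the actual CWTS procedure; whether, say, the medical centre counts with the university is itself a design decision):

```python
# Sketch: unifying raw affiliation strings under one canonical institution.
# The mapping is a tiny hypothetical example, not the real CWTS data.
CANONICAL = {
    "leiden univ": "Leiden University",
    "univ leiden": "Leiden University",
    "rijksuniv leiden": "Leiden University",
    "leiden univ med ctr": "Leiden University",   # medical centre counted with the university
    "lumc": "Leiden University",
    "acad ziekenhuis leiden": "Leiden University",
}

def unify(raw):
    """Map a raw affiliation string to a canonical institution, if known."""
    return CANONICAL.get(raw.strip().lower())

papers = ["leiden univ", "LUMC", "rijksuniv leiden", "unknown inst"]
counts = {}
for p in papers:
    inst = unify(p)
    if inst:
        counts[inst] = counts.get(inst, 0) + 1

print(counts)  # {'Leiden University': 3}
```

Rankings that unify such variants differently will credit the same publications to differently delineated "universities", which is one reason their results diverge.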
University rankings should be sufficiently
transparent
Source: www.issi-society.org/open-citations-letter/
Need for consistency and transparency
Weak correlation between citation scores in the Leiden Ranking and
the Times Higher Education ranking
Interpretation of rankings
5. Comparisons between universities should be made
keeping in mind the differences between them
6. Uncertainty in university rankings should be
acknowledged
7. An exclusive focus on ranks of universities should be
avoided; values of underlying indicators should be taken
into account
Uncertainty in university rankings should
be acknowledged
Stephen Curry, The Guardian, October 6, 2015
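One way to acknowledge uncertainty is the kind of stability interval the Leiden Ranking reports, obtained by resampling a university's publications. A minimal sketch, assuming a hypothetical university with 200 publications of which 30 are top-10% cited:

```python
# Sketch: a bootstrap stability interval for a PP(top 10%)-style indicator.
# The university and its publication counts are hypothetical.
import random

random.seed(42)

# 200 publications; 1 = top-10% cited, 0 = not.
pubs = [1] * 30 + [0] * 170

def pp_top10(sample):
    """Percentage of publications in the sample that are top-10% cited."""
    return 100 * sum(sample) / len(sample)

# Resample publications with replacement and recompute the indicator.
boot = sorted(
    pp_top10([random.choice(pubs) for _ in pubs]) for _ in range(1000)
)
lower, upper = boot[25], boot[974]  # central 95% of bootstrap values

print(f"PP(top 10%) = {pp_top10(pubs):.1f}%, "
      f"stability interval ≈ [{lower:.1f}%, {upper:.1f}%]")
```

The interval spans several percentage points, which corresponds to dozens of rank positions in the middle of a ranking: small rank differences are usually noise.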
Take into account values of indicators
Rockefeller University
Rank 1 (PP(top 10%) = 28.2%)
Queen Mary University London
Rank 50 (PP(top 10%) = 14.8%)
University of Bari Aldo Moro
Rank 550 (PP(top 10%) = 8.0%)
The difference in PP(top 10%) is two times larger between the universities
at ranks 1 and 50 than between those at ranks 50 and 550
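The arithmetic behind the slide, using the PP(top 10%) values shown above:

```python
# PP(top 10%) values from the slide.
pp_rockefeller, pp_qmul, pp_bari = 28.2, 14.8, 8.0

gap_top = pp_rockefeller - pp_qmul   # 49 rank positions apart (1 vs 50)
gap_bottom = pp_qmul - pp_bari       # 500 rank positions apart (50 vs 550)

print(round(gap_top, 1), round(gap_bottom, 1), round(gap_top / gap_bottom, 2))
# A rank gap ten times larger corresponds to only half the indicator gap.
```

This is why ranks alone mislead: equal steps in rank do not correspond to equal steps in performance.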
Use of rankings
8. Dimensions of university performance not covered by
university rankings should not be overlooked
9. Performance criteria relevant at the university level should
not automatically be assumed to have the same relevance
at the department or research group level
10. University rankings should be handled cautiously, but
they should not be dismissed as completely useless
Simplistic use of rankings
Proper use of rankings
Encouraging proper use
Conclusions
• Rankings provide valuable information…
• …but only when designed, interpreted, and used in a
proper manner
• Ranking producers, universities, governments, and news
media need to work toward shared principles for
responsible university ranking
• University rankings are here to stay…
• …but we can change the rules of the game
Thank you for your attention!
