1. How to design a ranking system: Criteria and opportunities for a comparison
Nees Jan van Eck
Centre for Science and Technology Studies (CWTS), Leiden University
Workshop: “University rankings: A challenge or a foe?”
University of Milan, Milan, Italy, May 29, 2017
4. CWTS Leiden Ranking
• Provides bibliometric indicators of:
– Scientific impact
– Scientific collaboration
• Calculated based on Clarivate Analytics Web of Science data
5. Selection of universities
• All universities worldwide with ≥1000 Web of Science publications in the period 2012–2015 (selection sketched below)
• 902 universities from 54 countries
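As a rough illustration of this selection rule, the sketch below filters a hypothetical table of publication counts. The file name and column names are invented for the example and are not the actual CWTS data schema.

```python
# Minimal sketch of the Leiden Ranking selection rule, applied to a
# hypothetical CSV with one row per university. The file name and
# column names are invented; this is not the CWTS workflow.
import csv

THRESHOLD = 1000  # at least 1000 WoS publications in 2012-2015

with open("university_publication_counts.csv", newline="") as f:
    universities = list(csv.DictReader(f))

selected = [u for u in universities
            if int(u["publications_2012_2015"]) >= THRESHOLD]
countries = {u["country"] for u in selected}
print(f"{len(selected)} universities from {len(countries)} countries")
```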
6. Differences with other university rankings
• No composite indicators
• Exclusive focus on scientific performance
• No survey data or data provided by universities
• Advanced bibliometric methodology
15. Responsible use of university rankings
• Ten principles for responsible use of rankings:
– Design of rankings
– Interpretation of rankings
– Use of rankings
• Covers university rankings in general, not only the Leiden Ranking
• Partly builds on the Leiden Manifesto
16. Design of rankings
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators (illustrated in the sketch after this list)
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
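Principle 2 can be illustrated with the Leiden Ranking's own impact indicators: P(top 10%), the number of a university's publications among the top 10% most cited, is size-dependent, while PP(top 10%), the corresponding proportion, is size-independent. The sketch below uses made-up numbers to show how the two can disagree.

```python
# Made-up numbers illustrating principle 2: the size-dependent count
# P(top 10%) and the size-independent proportion PP(top 10%) can rank
# the same two universities in opposite orders.
universities = {
    # name: (total publications, publications in the top 10% most cited)
    "Large University": (10_000, 1_100),
    "Small Institute": (1_200, 240),
}

for name, (p_total, p_top10) in universities.items():
    pp_top10 = p_top10 / p_total  # size-independent proportion
    print(f"{name}: P(top 10%) = {p_top10}, PP(top 10%) = {pp_top10:.0%}")

# Large University leads on the count (1100 vs 240), while Small
# Institute leads on the proportion (20% vs 11%).
```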
17. Do not use a generic concept of university performance
(Figure: composite indicator)
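A toy calculation makes the objection concrete: two universities with very different profiles can receive exactly the same composite score. The weights and dimension scores below are invented for the illustration and do not correspond to any real ranking's methodology.

```python
# Invented weights and dimension scores showing how a composite
# indicator hides performance profiles; not any real ranking's method.
weights = {"impact": 0.5, "collaboration": 0.3, "teaching": 0.2}

profiles = {
    "University A": {"impact": 90, "collaboration": 40, "teaching": 50},
    "University B": {"impact": 50, "collaboration": 80, "teaching": 90},
}

for name, scores in profiles.items():
    composite = sum(w * scores[dim] for dim, w in weights.items())
    print(f"{name}: composite = {composite:.0f}  {scores}")

# Both composites equal 67, although A excels in impact and B in
# collaboration and teaching; the single number erases the difference.
```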
18. Interpretation of rankings
5. Comparisons between universities should be made keeping in mind differences between universities
6. Uncertainty in university rankings should be acknowledged (a bootstrap sketch follows this list)
7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
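For principle 6, the Leiden Ranking reports stability intervals for its indicators. The sketch below shows the general bootstrap idea behind such intervals; it is a simplified illustration with hypothetical numbers, not CWTS's actual implementation.

```python
# Minimal bootstrap sketch of a stability interval for PP(top 10%).
# It illustrates the general idea behind principle 6 and is not
# CWTS's actual implementation.
import random

def stability_interval(top10_flags, n_resamples=1000, level=0.95):
    """Bootstrap a stability interval for the share of top-10% papers.

    top10_flags holds one 0/1 flag per publication of the university.
    """
    n = len(top10_flags)
    shares = sorted(
        sum(random.choices(top10_flags, k=n)) / n
        for _ in range(n_resamples)
    )
    lower = shares[int((1 - level) / 2 * n_resamples)]
    upper = shares[int((1 + level) / 2 * n_resamples) - 1]
    return lower, upper

# Hypothetical university: 1200 publications, 180 in the top 10%.
flags = [1] * 180 + [0] * 1020
print(stability_interval(flags))  # roughly (0.13, 0.17)
```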
19. Take into account values of indicators
• Rockefeller University: rank 1 (PP(top 10%) = 28.2%)
• Queen Mary University of London: rank 50 (PP(top 10%) = 14.8%)
• University of Bari Aldo Moro: rank 550 (PP(top 10%) = 8.0%)
The difference in PP(top 10%) between the universities at ranks 1 and 50 (13.4 percentage points) is about twice as large as the difference between the universities at ranks 50 and 550 (6.8 percentage points)
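The arithmetic behind this comparison, using the PP(top 10%) values listed above:

```python
# Differences in PP(top 10%), in percentage points, from the values above.
gap_rank1_rank50 = 28.2 - 14.8   # 13.4
gap_rank50_rank550 = 14.8 - 8.0  # 6.8
print(gap_rank1_rank50 / gap_rank50_rank550)  # ~1.97, about twice as large
```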
20. Use of rankings
8. Dimensions of university performance not covered by university rankings should not be overlooked
9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level
10. University rankings should be handled cautiously, but they should not be dismissed as completely useless
23. More information
• www.leidenranking.com
– Leiden Ranking 2017 statistics
– Methodological information
– Contact form
• www.cwts.nl/blog
– Principles for responsible use of university rankings