
How to design a ranking system: Criteria and opportunities for a comparison


Presentation at the workshop "University rankings: A challenge or a foe?", organized by the University of Milan, Milan, Italy, May 29, 2017.



  1. How to design a ranking system: Criteria and opportunities for a comparison
     Nees Jan van Eck, Centre for Science and Technology Studies (CWTS), Leiden University
     Workshop “University rankings: A challenge or a foe?”, University of Milan, Milan, Italy, May 29, 2017
  2. Outline
     • CWTS Leiden Ranking
     • Responsible use of university rankings
  3. CWTS Leiden Ranking
  4. CWTS Leiden Ranking
     • Provides bibliometric indicators of:
       – Scientific impact
       – Scientific collaboration
     • Calculated based on Clarivate Analytics Web of Science data
  5. Selection of universities
     • All universities worldwide with ≥1000 Web of Science publications in the period 2012–2015
     • 902 universities from 54 countries
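The selection rule on this slide (at least 1000 Web of Science publications in 2012–2015) can be sketched as a simple filter. The publication counts below are illustrative placeholders, not actual Web of Science data:

```python
# Sketch of the Leiden Ranking selection rule: keep only universities
# with >= 1000 Web of Science publications in the period 2012-2015.
# The counts below are made up for illustration.

THRESHOLD = 1000  # minimum publication count for inclusion

publication_counts = {
    "University A": 4200,
    "University B": 950,   # below the threshold, excluded
    "University C": 1730,
}

selected = {name: count for name, count in publication_counts.items()
            if count >= THRESHOLD}

print(sorted(selected))  # University B is filtered out
```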
  6. Differences with other university rankings
     • No composite indicators
     • Exclusive focus on scientific performance
     • No survey data or data provided by universities
     • Advanced bibliometric methodology
  7.–13. [Figure-only slides; no text content]
  14. Responsible use of university rankings
  15. Responsible use of university rankings
      • Ten principles for the responsible use of rankings:
        – Design of rankings
        – Interpretation of rankings
        – Use of rankings
      • Covers university rankings in general, not only the Leiden Ranking
      • Partly builds on the Leiden Manifesto
  16. Design of rankings
      1. A generic concept of university performance should not be used
      2. A clear distinction should be made between size-dependent and size-independent indicators
      3. Universities should be defined in a consistent way
      4. University rankings should be sufficiently transparent
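The distinction in principle 2 can be made concrete with the Leiden Ranking's own pair of impact indicators: P(top 10%), the number of a university's publications among the 10% most cited (size-dependent), and PP(top 10%), the proportion of its publications in that group (size-independent). A minimal sketch with made-up numbers:

```python
# Size-dependent vs. size-independent indicators (illustrative numbers).
# P_top10  = publications in the top 10% most cited  -> size-dependent
# PP_top10 = proportion of publications in the top 10% -> size-independent

universities = {
    # name: (total publications, publications in the top 10% most cited)
    "Large University": (20000, 2400),
    "Small Institute":  (1500, 300),
}

for name, (p_total, p_top10) in universities.items():
    pp_top10 = p_top10 / p_total
    print(f"{name}: P_top10 = {p_top10}, PP_top10 = {pp_top10:.1%}")

# The large university wins on the size-dependent indicator (2400 vs 300),
# while the small institute wins on the size-independent one (20% vs 12%),
# which is why the two types must not be conflated in a ranking.
```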
  17. Do not use a generic concept of university performance [figure: composite indicator]
  18. Interpretation of rankings
      5. Comparisons between universities should be made keeping in mind differences between universities
      6. Uncertainty in university rankings should be acknowledged
      7. An exclusive focus on ranks of universities should be avoided; values of the underlying indicators should be taken into account
  19. Take into account values of indicators
      – Rockefeller University: rank 1 (PPtop 10% = 28.2%)
      – Queen Mary University of London: rank 50 (PPtop 10% = 14.8%)
      – University of Bari Aldo Moro: rank 550 (PPtop 10% = 8.0%)
      The difference in PPtop 10% between the universities at ranks 1 and 50 is about twice as large as the difference between those at ranks 50 and 550
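The claim on this slide follows directly from the quoted PPtop 10% values, and checking it shows why indicator values carry more information than ranks alone:

```python
# PPtop 10% values quoted on the slide (Leiden Ranking 2017),
# in percentage points.
pp_rank_1 = 28.2    # Rockefeller University
pp_rank_50 = 14.8   # Queen Mary University of London
pp_rank_550 = 8.0   # University of Bari Aldo Moro

gap_1_to_50 = pp_rank_1 - pp_rank_50      # 13.4 percentage points
gap_50_to_550 = pp_rank_50 - pp_rank_550  # 6.8 percentage points

# Moving 49 positions near the top changes the indicator about twice
# as much as moving 500 positions further down.
print(gap_1_to_50 / gap_50_to_550)  # roughly 2
```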
  20. Use of rankings
      8. Dimensions of university performance not covered by university rankings should not be overlooked
      9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level
      10. University rankings should be handled cautiously, but they should not be dismissed as completely useless
  21. Use of rankings for promotion
  22. [Figure-only slide; no text content]
  23. More information
      • www.leidenranking.com
        – Leiden Ranking 2017 statistics
        – Methodological information
        – Contact form
      • www.cwts.nl/blog
        – Principles for responsible use of university rankings
  24. Thank you for your attention!
