
A scientometric perspective on university ranking



Presentation at the Center of Scientometrics, National Science Library, Chinese Academy of Sciences, Beijing, China, April 12, 2019.


  1. A scientometric perspective on university ranking. Nees Jan van Eck, Centre for Science and Technology Studies (CWTS), Leiden University. Center of Scientometrics (CoS), National Science Library, Chinese Academy of Sciences. Beijing, China, April 12, 2019
  2. Centre for Science and Technology Studies (CWTS) • Research center at Leiden University • Science and technology studies, with a considerable emphasis on scientometrics • About 50 staff members • Our mission: Making the science system better! • Research groups: – Quantitative Science Studies – Science and Evaluation Studies – Science, Technology, and Innovation Studies • Commissioned research for research institutions, funders, governments, companies, etc.
  3. About myself • Master in computer science • PhD thesis on bibliometric mapping of science • Senior researcher at CWTS – Bibliometric network analysis and visualization – Bibliometric data sources – Bibliometric indicators • Head of ICT
  4. VOSviewer
  5. Outline • CWTS Leiden Ranking • Responsible use of university rankings
  6. CWTS Leiden Ranking
  7. CWTS Leiden Ranking
  8. CWTS Leiden Ranking • Provides bibliometric indicators of: – Scientific impact – Scientific collaboration • Calculated based on Clarivate Analytics Web of Science data
  9. Selection of universities (2018 edition) • All universities worldwide with ≥1000 Web of Science publications in the period 2013–2016 • 938 universities from 55 countries
  10. Indicators • Size-dependent and size-independent indicators, where PP(X) = P(X) / P • Scientific output: – P • Scientific impact: – P(top 1%) and PP(top 1%) – P(top 5%) and PP(top 5%) – P(top 10%) and PP(top 10%) – P(top 50%) and PP(top 50%) – TCS and MCS – TNCS and MNCS • Scientific collaboration: – P(collab) and PP(collab) – P(int collab) and PP(int collab) – P(industry) and PP(industry) – P(<100 km) and PP(<100 km) – P(>5000 km) and PP(>5000 km)
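To make the size-dependent vs. size-independent distinction concrete, here is a minimal Python sketch, using made-up publication data rather than anything from the ranking, that computes P(top 10%) and the corresponding share PP(top 10%) = P(top 10%) / P:

```python
# Minimal sketch with made-up data: each entry says whether a publication
# belongs to the top 10% most cited publications of its field and year.
publications = [True, False, False, True, False, False, False, False, False, False]

p = len(publications)          # P: total number of publications (size-dependent)
p_top10 = sum(publications)    # P(top 10%): number of highly cited publications (size-dependent)
pp_top10 = p_top10 / p         # PP(top 10%): share of highly cited publications (size-independent)

print(f"P = {p}, P(top 10%) = {p_top10}, PP(top 10%) = {pp_top10:.1%}")
```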
  11. Differences with other university rankings • No composite indicators • Focused on research, not on teaching • Based purely on bibliometric indicators; no survey data or data provided by universities • High-quality bibliometric methodology • Multiple views, not just a simple list
  19. Advanced bibliometric methodology • Field classification system • Counting citations vs. counting highly cited publications • Full counting vs. fractional counting • Bibliographic database
  20. About 4000 fields of science in the Leiden Ranking; main fields: Social sciences and humanities, Biomedical and health sciences, Life and earth sciences, Mathematics and computer science, Physical sciences and engineering
  21. Why count highly cited publications? • Leiden Ranking counts number of highly cited publications (top 10%) • THE, QS, and US News count number of citations • Effect of counting number of citations:
  22. Why count highly cited publications?
  23. Why count highly cited publications? [Figure comparing counting citations with counting highly cited publications, and the effect of leaving out Göttingen’s most cited publication]
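A small sketch, with invented citation counts rather than the actual Göttingen data, illustrates the slide's point: one extremely highly cited publication dominates a mean citation indicator, but contributes at most one publication to a top 10% count:

```python
# Invented citation counts for one university's publications; 2500 plays the role
# of a single extremely highly cited paper, and 11 is an assumed top 10% threshold.
citations = [2, 5, 8, 1, 0, 12, 3, 7, 4, 2500]
top10_threshold = 11

def mean_citations(counts):
    return sum(counts) / len(counts)

def highly_cited(counts, threshold):
    return sum(c >= threshold for c in counts)

print(f"Mean citations: {mean_citations(citations):.1f} "
      f"vs. {mean_citations(citations[:-1]):.1f} without the outlier")
print(f"Top 10% publications: {highly_cited(citations, top10_threshold)} "
      f"vs. {highly_cited(citations[:-1], top10_threshold)} without the outlier")
```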
  24. How to handle publications co-authored by multiple institutions? • THE, QS, and US News: – Co-authored publications are fully assigned to each co-authoring institution (full counting) • Leiden Ranking: – Co-authored publications are fractionally assigned to each co-authoring institution (fractional counting) • In the example on the slide, the publication is assigned to Leiden with a weight of 2/4
  25. Why use fractional counting? • Full counting is biased in favor of universities with a strong biomedical focus
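As a minimal sketch of the difference, assuming a hypothetical publication whose four author affiliations include two Leiden addresses (matching the 2/4 weight mentioned on the slide):

```python
# Hypothetical author affiliations for one co-authored publication.
affiliations = ["Leiden", "Leiden", "Utrecht", "Oxford"]

def full_count(affils, university):
    # Full counting: the publication counts once for every co-authoring institution.
    return 1.0 if university in affils else 0.0

def fractional_count(affils, university):
    # Fractional counting: the publication is split over its author affiliations.
    return affils.count(university) / len(affils)

print(full_count(affiliations, "Leiden"))        # 1.0
print(fractional_count(affiliations, "Leiden"))  # 0.5, i.e. a weight of 2/4
```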
  26. Choice of bibliographic database: Is more data always better? • Universities from China, Russia, France, Germany, etc. may not benefit at all from having more data • Indicators should be based on a restricted database of publications • Leiden Ranking uses Web of Science, but excludes national scientific journals, trade journals, and popular magazines
  27. Responsible use of university rankings
  28. Responsible use of university rankings • Ten principles for responsible use of rankings: – Design of rankings – Interpretation of rankings – Use of rankings • Covers university rankings in general, not only the Leiden Ranking • Source: www.cwts.nl/blog?article=n-r2q274
  29. Responsible use of university rankings • Source: www.researchresearch.com/news/article/?articleId=1368350 • Source: https://vimeo.com/279712695
  30. Design of rankings 1. A generic concept of university performance should not be used 2. A clear distinction should be made between size-dependent and size-independent indicators 3. Universities should be defined in a consistent way 4. University rankings should be sufficiently transparent
  31. Design of rankings 1. A generic concept of university performance should not be used 2. A clear distinction should be made between size-dependent and size-independent indicators 3. Universities should be defined in a consistent way 4. University rankings should be sufficiently transparent
  32. Do not use a generic concept of university performance [Figure: composite indicator]
  33. Do not use a generic concept of university performance
  34. Do not use a generic concept of university performance
  35. Design of rankings 1. A generic concept of university performance should not be used 2. A clear distinction should be made between size-dependent and size-independent indicators 3. Universities should be defined in a consistent way 4. University rankings should be sufficiently transparent
  36. Distinguish between size-dependent and size-independent indicators • What are the wealthiest countries in the world? [Figure comparing GDP with GDP per capita]
  37. Distinguish between size-dependent and size-independent indicators • Shanghai, THE, QS, and US News use composite indicators • These composite indicators combine size-dependent and size-independent indicators • It is unclear which concept of scientific performance is measured
  38. Distinguish between size-dependent and size-independent indicators
  39. Design of rankings 1. A generic concept of university performance should not be used 2. A clear distinction should be made between size-dependent and size-independent indicators 3. Universities should be defined in a consistent way 4. University rankings should be sufficiently transparent
  40. Define universities consistently • leiden univ • leiden state univ • state univ leiden • leiden univ hosp • state univ leiden hosp • univ leiden hosp • univ hosp leiden • lumc • univ leiden • leiden univ med ctr • leiden state univ hosp • leiden observ • sterrewacht leiden • acad hosp leiden • rijksuniv leiden • rijksherbarium • gorlaeus labs • leiden inst brain & cognit • leiden inst chem • sylvius labs • acad ziekenhuis leiden • leiden cytol & pathol lab • rijksherbarium hortus bot • ...
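The sketch below hints at the kind of cleaning this requires; the variant strings come from the slide, but the lookup-table approach is a simplified assumption, not the actual CWTS procedure:

```python
# A few of the address variants from the slide, all referring to Leiden University.
LEIDEN_VARIANTS = {
    "leiden univ", "univ leiden", "rijksuniv leiden", "leiden state univ",
    "leiden univ med ctr", "lumc", "acad hosp leiden", "leiden observ",
    "sterrewacht leiden", "gorlaeus labs", "leiden inst chem",
}

def normalize_affiliation(raw_address: str) -> str:
    """Map a raw address string to a canonical institution name (simplified)."""
    key = raw_address.strip().lower()
    return "Leiden University" if key in LEIDEN_VARIANTS else key

print(normalize_affiliation("Rijksuniv Leiden"))  # Leiden University
print(normalize_affiliation("LUMC"))              # Leiden University
```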
  41. Design of rankings 1. A generic concept of university performance should not be used 2. A clear distinction should be made between size-dependent and size-independent indicators 3. Universities should be defined in a consistent way 4. University rankings should be sufficiently transparent
  42. University rankings should be sufficiently transparent (www.issi-society.org/open-citations-letter/)
  43. Interpretation of rankings 5. Comparisons between universities should be made keeping in mind differences between universities 6. Uncertainty in university rankings should be acknowledged 7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
  44. Interpretation of rankings 5. Comparisons between universities should be made keeping in mind differences between universities 6. Uncertainty in university rankings should be acknowledged 7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
  45. Interpretation of rankings 5. Comparisons between universities should be made keeping in mind differences between universities 6. Uncertainty in university rankings should be acknowledged 7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
  46. Interpretation of rankings 5. Comparisons between universities should be made keeping in mind differences between universities 6. Uncertainty in university rankings should be acknowledged 7. An exclusive focus on ranks of universities should be avoided; values of underlying indicators should be taken into account
  47. Take into account values of indicators • Rockefeller University: rank 1, PP(top 10%) = 28.2% • Queen Mary University London: rank 50, PP(top 10%) = 14.8% • University of Bari Aldo Moro: rank 550, PP(top 10%) = 8.0% • The difference in PP(top 10%) is two times larger for the universities at ranks 1 and 50 than for the universities at ranks 50 and 550
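A quick check of the arithmetic behind this slide, using the PP(top 10%) values shown above:

```python
# PP(top 10%) values taken from the slide's example.
pp_rockefeller, pp_queen_mary, pp_bari = 28.2, 14.8, 8.0

diff_rank_1_to_50 = pp_rockefeller - pp_queen_mary   # 13.4 percentage points over 49 ranks
diff_rank_50_to_550 = pp_queen_mary - pp_bari        # 6.8 percentage points over 500 ranks

print(f"{diff_rank_1_to_50:.1f} vs. {diff_rank_50_to_550:.1f} percentage points")
# Roughly a factor of two, even though the second gap spans ten times as many ranks.
```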
  48. Use of rankings 8. Dimensions of university performance not covered by university rankings should not be overlooked 9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level 10. University rankings should be handled cautiously, but they should not be dismissed as being completely useless
  49. Simplistic use of rankings
  50. Proper use of rankings
  51. Encouraging proper use
  52. Conclusions • Rankings provide valuable information… • …but only when designed, interpreted, and used in a proper manner • Ranking producers, universities, governments, and news media need to work toward shared principles for responsible university ranking
  53. CWTS Leiden Ranking 2019 • Two new types of indicators, in addition to impact and collaboration indicators • Open access indicators (based on Unpaywall data) – P(OA) – P(gold) – P(green) – P(unknown) • Gender indicators – A(male) – A(female) – A(unknown)
  54. Thank you for your attention!
