GLOBAL UNIVERSITY RANKINGS: CHANGING METHODOLOGIES
Richard Holmes
University Ranking Watch
Ekaterinburg, Russian Federation
INTRODUCTION
 All major global rankings have, to varying degrees, introduced changes in their methodologies
Changes include:
 Source of data
 Calculation of indicator scores
 Measures to reduce fluctuation
 Measures to eliminate outliers and anomalies
REVIEW OF CHANGES: QS
 2004: launch of the Times Higher Education Supplement (THES) – Quacquarelli Symonds World University Rankings
 2005: employer survey added, with a 10% weighting
 A cluster of changes in 2007: full-time-equivalent counts instead of headcounts, z-scores for each indicator (sketched below), no voting for one's own university in the survey, and a switch to data from Scopus
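A minimal sketch of per-indicator z-score standardisation, assuming the textbook formula; QS's exact implementation is not public, and the figures below are invented:

```python
import numpy as np

def z_scores(values):
    """Standardise one indicator: subtract the mean and divide by the
    standard deviation, so indicators measured on different scales
    become comparable before weighting."""
    values = np.asarray(values, dtype=float)
    return (values - values.mean()) / values.std()

# Illustrative citations-per-faculty figures for five universities
raw = [12.0, 45.0, 33.0, 8.0, 70.0]
print(z_scores(raw))  # centred on 0, unit variance
```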
QS CONTINUED
 Continued development of the academic and employer surveys, and of procedures against manipulation
QS CONTINUED
A second cluster of changes in 2015
 Normalisation: five subject groups treated as contributing equally to the citations indicator (see the sketch after this list)
 Roll-over period for survey responses increased from
3 to 5 years
 Papers with “freakish” numbers of authors not
counted for citations
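A minimal sketch of the 2015 normalisation, assuming each of QS's five faculty areas contributes an equal 20% to the citations score; the area names follow QS's published groupings, and all figures are invented:

```python
# Each faculty area contributes equally, so one dominant field
# (e.g. medicine) can no longer swamp the citations indicator.
FACULTY_AREAS = [
    "arts & humanities", "engineering & technology",
    "life sciences & medicine", "natural sciences",
    "social sciences & management",
]

def citations_score(per_area_scores):
    """Average the per-area citation scores with equal (20%) weights.
    `per_area_scores` maps faculty area -> a 0-100 normalised score."""
    return sum(per_area_scores[a] for a in FACULTY_AREAS) / len(FACULTY_AREAS)

example = {
    "arts & humanities": 55.0,
    "engineering & technology": 70.0,
    "life sciences & medicine": 95.0,
    "natural sciences": 88.0,
    "social sciences & management": 62.0,
}
print(citations_score(example))  # 74.0
```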
EFFECTS OF QS 2015 CHANGES
 Surveys more stable
 Universities strong in physics and medicine suffered
while those strong in social sciences and engineering
benefitted
THE (TIMES HIGHER EDUCATION): CHANGES IN 2011
 International research collaboration added
 International students and international faculty given
equal weighting
 Public research income as % of research income
dropped
 Reduced weighting for citations, which are counted over six years
 Field normalisation applied to a further three research indicators (sketched below)
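A minimal sketch of field normalisation in the generic bibliometric sense, where each paper's citations are divided by the world average for its field; THE's exact formula is not reproduced here, and all figures are invented:

```python
def field_normalised_impact(papers, world_averages):
    """papers: list of (field, citations) pairs.
    world_averages: field -> mean citations per paper worldwide.
    Returns the mean normalised citation impact (1.0 = world average),
    so a paper in low-citation history is judged against history,
    not against high-citation medicine."""
    ratios = [cites / world_averages[field] for field, cites in papers]
    return sum(ratios) / len(ratios)

papers = [("history", 3), ("history", 1), ("medicine", 40)]
world = {"history": 2.0, "medicine": 20.0}
print(field_normalised_impact(papers, world))  # (1.5 + 0.5 + 2.0) / 3 ≈ 1.33
```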
CONTINUED
 The main result of these changes was to help universities with strengths in the social sciences
 They also removed the anomaly of Alexandria University ranking as the world's 4th best university for research impact
THE: CHANGES IN 2015
 Switch from Thomson Reuters to Scopus as a source
of data
 Expansion of academic survey
 Regional modification applied to half of the citations indicator (sketched after this list)
 Papers with very large numbers of authors excluded from citation counts
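A minimal sketch of the regional modification as commonly described: a university's citation score is divided by the square root of its country's average score, boosting universities in low-citing countries, and from 2015 the adjustment applied to only half of the indicator. The blend parameter and figures are illustrative assumptions:

```python
import math

def regionally_modified(univ_score, country_avg, blend=0.5):
    """Blend the raw citation score with the country-adjusted score.
    blend=0.5 mirrors applying the modification to half the indicator."""
    adjusted = univ_score / math.sqrt(country_avg)
    return blend * adjusted + (1 - blend) * univ_score

# A university with impact 0.8 in a country averaging 0.64
print(regionally_modified(0.8, 0.64))  # 0.5 * 1.0 + 0.5 * 0.8 = 0.9
```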
CONTINUED
 Universities that had participated in multi-contributor physics projects and benefitted from the regional modification now suffered, including universities in France, Korea, Japan and Turkey
 Oxford and Cambridge overtook Harvard
 Small and specialized institutions did well
OTHER MAJOR RANKINGS:
SHANGHAI ARWU
 2004: helped social-science institutions by not counting Nature and Science publications for them, and by double-weighting social science publications
 2013: highly cited researchers with secondary affiliations credited on the basis of estimates supplied by the researchers themselves
 2014-15: used both the old and the new highly cited researchers lists; secondary affiliations not counted
US NEWS BEST GLOBAL
UNIVERSITIES
 2015: introduced new indicators, including books and conferences
QUESTIONS FOR QS
 Do you think that not counting multi-contributor papers for citations is a good idea? Have you considered fractional counting? (Both approaches are sketched after these questions.)
 Are you satisfied with the moderate degree of field normalisation introduced in 2015?
 Do you consider a 40% weighting for the academic
survey acceptable?
 And … do you plan any further changes?
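A minimal sketch contrasting the two treatments of hyper-authored papers: outright exclusion above an author-count threshold, as QS and THE adopted, versus fractional counting, in which each paper's citations are divided among its contributors. The threshold and figures are illustrative only:

```python
def excluded_count(papers, max_authors=1000):
    """Ignore papers above an author-count threshold entirely."""
    return sum(cites for cites, n_authors in papers if n_authors <= max_authors)

def fractional_count(papers):
    """Credit each contributor with citations / number of authors."""
    return sum(cites / n_authors for cites, n_authors in papers)

# (citations, number of authors): one normal paper, one hyper-authored
papers = [(40, 4), (3000, 2000)]
print(excluded_count(papers))    # 40  (the 2000-author paper is dropped)
print(fractional_count(papers))  # 40/4 + 3000/2000 = 11.5
```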
QUESTIONS FOR THE (TIMES HIGHER EDUCATION)
 Do you think that not counting multi-contributor
papers for citations is a good idea? Have you
considered fractional counting?
 Do you consider 30% an acceptable weighting for the
citations indicator?
 Can you explain how Oxford and Cambridge have
surpassed Harvard and MIT?
 And … do you plan on any more changes?
