Rankings, Universities and the Media
RICHARD HOLMES
BOLOGNA 10 MAY 2019
Some observations
• Rankings have proliferated since 2003
• 17 global rankings in the IREG Inventory
• Plus business, subject, regional rankings, etc.
• The inventory is already out of date
• Largely ignored in 2003; now a matter of intense public concern
Ranking Realities: the ranking community
• Diverse and complex
• The IREG International Inventory has 45 rankings
• Since publication, new rankings from Russia, Kazakhstan and Iran
Ranking Realities: the media
• THE
• QS
• Shanghai Rankings
• US News Best Global Universities
• CWUR, Nature Index, Webometrics, Reuters, Emerging/Trendence, Round University Rankings, National Taiwan University, URAP, Scimago, etc., etc.
Media coverage highly selective
• Compare the THE World University Rankings and the Round University Rankings
• Similar methodology
• But RUR includes more indicators
• RUR has sensible weighting
• Compare Iran's Babol Noshirvani University in the two rankings
• THE: top 400, 1st in Iran, 1st for normalised citations
• RUR: 615th, 7th in Iran, 59th for citations
continued
• BUT compare the public response to the two rankings
• Google search results:
• THE – over 1 million
• RUR – 62,000
• THE reported in the national press, praised by university heads
• RUR covered in Russia, Central Asia and Eastern Europe
Rankings hierarchy
1. THE
2. QS
3. Shanghai
4. US News Best Global Universities
5. Research-based rankings by universities or research centres
6. Web-based rankings
continued
• Yesterday’s sessions – Raghu and Cesar referred to THE and QS
• Exceptions – URAP in Malaysia, Webometrics in Africa
• Does this hierarchy have anything to do with quality of any kind?
• Reliability, validity, consistency, transparency?
• Or does it reflect the interests of the current elites of higher education?
Collusion between media, universities and rankers
• Errors and defects of rankings are (mostly) quietly ignored
• We don’t see (much) criticism of THE for its bizarre citation results or of QS for its emphasis on surveys
• Rankers provide useful data, but also headlines for the public and scapegoats and heroes for the media
continued
• Narratives for universities – doomsday is coming unless we get more money and more international students; British universities are in danger because of Brexit
Forget about rankings?
• Can we just forget about rankings?
• We would simply revert to the pre-rankings days and have informal rankings
Criticism of rankings
• From the periphery: blogs and social media
• Often anonymous
• Often from less well-known institutions
• Technical criticism in journal articles
• University leaders: "We are unique, rankings do not measure what matters"
• What is missing is criticism that is not anti-ranking in principle and is comprehensible to specialists
What should be done?
• More rankings by universities
• More informed and independent criticism of rankings
• Separate ranking, auditing, journalism, advertising
