2. THEMES
• University Rankings and internationalisation
• Rankings and Southeast Asia: harmonisation, competition,
diversity and inclusion
• Rankings in Southeast Asia: potential and actuality
• Malaysia and Singapore
• Proposals
3. RANKINGS: A VERY SHORT HISTORY
• Global rankings began modestly
• 2003 Shanghai rankings -- 500 universities, 6 indicators,
simple transparent methodology
• Little public impact but a shock to many European
universities
• 2004 Webometrics
• 2004 THES-QS -- big news in Malaysia
4. CONTINUED
• Since then rankings have proliferated – IREG inventory has
17 global plus 4 sub-rankings, 5 subject, 2 Asian regional, 8
business school and 2 system rankings
• Influential – shaping university and government policy –
immigration, funding, appointment, promotion, partnership
5. CONTINUED
• More sophisticated – more indicators, standardisation,
normalisation, removing anomalies, more data points
• Moving beyond research and teaching towards third
missions
• But big gap – no serious measure of teaching and learning
or student/graduate quality
6. INTERNATIONALISATION
• Internationalisation means students, faculty, money, ideas
moving across borders
• Problematic for SE Asia with diverse cultures, languages,
histories, systems
• 2014 Morshidi Sirat, Norzaini Azman, Aishah Abu Bakar
wrote of the need for harmonisation and a regional higher
education space
• Can rankings contribute to these processes?
7. THE POTENTIAL OF RANKINGS
• Rankings include data that is relevant to internationalisation
• Also data that recognises diversity of size and mission
• Innovation, faculty-student ratio, publications, citations,
international students, faculty, collaboration, student
exchange, web activity and popularity, income, staff with
PhDs, number of students, sustainability
8. CONTINUED
• Rankings have created guidelines about definitions, e.g.
international students and FTE staff, as noted by MIT
• Procedures for limiting fluctuations and reducing outliers
• BUT the full potential has not been realised
• Only a few rankings are well known
• Indicator ranks are ignored
• Some include only a few Southeast Asian universities
9. RESPONSES TO RANKINGS
• Some Southeast Asian countries have ignored rankings
• Others have used them selectively -- Singapore (THE and
QS); Malaysia, Thailand and Indonesia (QS)
• Indonesia has developed its own Green ranking
10. PERFORMANCE IN RANKINGS
• Singapore (NUS) does extremely well in THE and QS -- less
well in others but still top 100
• Malaysia – Universiti Malaya top 100 in QS but top 300 or
400 in other rankings
• Myanmar and Cambodia ranked only in Webometrics and
uniRank
• Others – a handful in QS, none in Shanghai rankings
11. MALAYSIA AND SINGAPORE
• Both are interested in rankings, perhaps “obsessed”
• Singapore emphasises THE and QS rankings
• Malaysia -- targets placing in QS: 4 in the top 200, one in the
top 25
• After some false starts, both have risen
• BUT there is some opposition
12. CONTINUED
• Malaysia does much better in QS than other rankings
• Universiti Malaya -- 87 in QS world rankings
• 301-350 in THE
• 213 in Round University Rankings (general)
• 335 in Scimago (research, innovation, social)
• 374 in Webometrics (web, research)
13. CONTINUED
• QS is unbalanced -- 40% for academic reputation (i.e.
research)
• Malaysia is over-represented in this survey -- more
respondents than China and India combined
• Not sustainable – respondents from Russia, Japan,
Kazakhstan, Iraq and Colombia increasing
• Vulnerable to QS methodological changes
• Metrics not included in QS – publications, patents, income
14. ALTERNATIVES TO RANKING POLICY
• The Penang Institute suggested in 2017 that Malaysia is
obsessed with rankings
• It recommended focusing on local indicators
• It criticised QS and the other big four rankings
• BUT using local ratings or rankings will be of limited value
internationally
15. OBSERVATIONS
• Increasing number of rankings
• Cover different metrics -- research, innovation, web activity,
teaching related data, third mission
• Big variations in scores and ranks
• Universities differ in quality, size, mission
16. CONTINUED
• Relying on one or two rankings deprives stakeholders of
useful information
• Focus on a single metric or a few can distort policy
• Vulnerable to methodological changes -- this year QS
changed how it collects survey data
• THE -- a disaster for Turkey when mega-papers were
excluded from citation counts
17. PROPOSAL: REGIONAL RANKING DATABASE
• A regional repository of information from rankings,
including ranks, scores and indicators
• Universities might be encouraged to link to it whenever they
advertise their place in a ranking
• Stakeholders could obtain a complete and rounded profile of
institutions
18. PROPOSAL: BENCHMARKING AND TARGETS
• Using QS alone is not a good idea -- it is unbalanced
• Nor THE -- its citations indicator is a big problem (see
UTAR in Malaysia or Veltech in India)
• THE bundles indicator scores -- not transparent
• Big four rankings biased towards rich, old, big, English-
speaking universities with medical schools
• Using any single ranking also not a good idea
19. CONTINUED
• If any single ranking is used then it should be non-
commercial, technically expert, and have a stable
methodology
• For research -- Leiden Ranking?
• General -- Round University Ranking (Russia)? 20
indicators with sensible weightings
• More appropriate: use several rankings
20. CONTINUED
• Could be a short list of rankings for different categories of
universities
• Mainly research
• Research and medical
• Research and teaching
• Mainly teaching
• Three or four rankings for each category
21. CONTINUED
• The rankings curve gets very steep at the top
• Going from 50th to 25th is much more difficult and
expensive than going from 200th to 100th
• Over ambitious targets can produce serious distortions
• Targets and benchmarking should take account of university
missions and size
• Better to raise the average quality of the whole system
22. PROPOSAL: REGIONAL INSTITUTIONAL SURVEY
• A regional survey modelled on the Global Institutional
Profiles Project (Clarivate)
• Collect data on numbers of students, faculty, publications,
citations, patents, internationalisation, income, collaboration
• Develop data on admissions standards and quality of
teaching and graduates
• This could supplement current rankings or form the basis of
new regional rankings
23. TO CONCLUDE
• Rankings have the potential to contribute to a common
regional educational space
• This means taking full advantage of the diversity of methods
and approaches of current rankings
• Not focussing on one or two