THE WORLD UNIVERSITY RANKINGS Kent Business School, 1 July 2010 Phil Baty  Editor Times Higher Education World University Rankings
About Times Higher Education   The weekly magazine for all higher education professionals
About www.timeshighereducation.co.uk The dedicated website for higher education news, jobs and resources
About TSL Education Times Educational Supplement Times Higher Education TES Prime TSL Events TES  HireWire
Why Rank? Rapid globalisation of higher education
Why Rank? Rankings have a useful function
The old THE-QS criteria. How the data were put together under the old (2004-2009) system: Peer Review, Citations, Student-faculty ratio, Recruiter review, Int’l students, Int’l staff.
What we found in 2009 rankings
What we found in 2009 rankings
What we found in 2009 rankings “ If the emerging nations of Asia concentrate their growing resources on a handful of institutions, tap a worldwide pool of talent, and embrace freedom of expression and freedom of inquiry, they have every prospect of success in building world-class universities. It will not happen overnight; it will take decades. But it may happen faster than ever before.” Rick Levin, President, Yale University, 2010.
But while the old rankings give us a helpful snapshot…
Why the change? There has been strong criticism of the old QS methodology, which Times Higher Education accepts and has listened to. For example: “Results have been highly volatile. There have been many sharp rises and falls… Fudan in China has oscillated between 72 and 195…” Simon Marginson, University of Melbourne. “Most people think that the main problem with the rankings is the opaque way it constructs its sample for its reputational rankings.” Alex Usher, vice-president of the Educational Policy Institute, US. “The logic behind the selection of the indicators appears obscure.” Christopher Hood, Oxford University.
Why the change? “ The organizations who promote such ideas should be unhappy themselves, and so should any supine universities who endorse results they view as untruthful ” Andrew Oswald, professor of Economics, University of Warwick, 2007.
Specific flaws. We consulted our editorial board, and they highlighted two key concerns: “peer review” and citations.
Peer review flaws. Peer review – simply a reputation survey. Inherently controversial and subjective. It reflects past, not current, performance. Based on stereotype or even ignorance. A good or bad reputation may be mindlessly replicated. But: there was support for a reputation measure in Thomson Reuters’ opinion poll – 79 per cent said it was a “must have” or “nice to have”. Reputation is crucial. A survey can bring in some measure of the things quantitative data cannot.
Peer review flaws. QS achieved a tiny response rate to its survey: in 2009, only around 3,500 people responded. There were tiny numbers of responses from individual countries – in 2008, just 182 from Germany and 236 from India. There was a lack of clarity over the questions asked: what are we judging? This is not good enough when you’re basing 40 per cent of the score on academic peer review.
Citation flaws. QS failed to take into account dramatically different citation volumes between disciplines. Major bias towards the hard sciences, because arts and humanities papers have much lower citation volumes. No normalisation for subject. Is the LSE really only 67th in the world?
Other problems. Staff-student ratio – is it really a measure of teaching quality? International student score – no way to judge the quality of the students. International staff score – ditto.
Despite major flaws, the WUR became massively influential “ The term world class universities has begun to appear in higher education discussions, in institutional mission statements, and government education policy worldwide” “ Many staffing and organisational decisions at institutions worldwide have been affected by ranking-related goals and outcomes.” “ Rankings play an important role in persuading the Government and universities to rethink core national values” US Institute for Higher Education Policy
Despite major flaws, the WUR became massively influential “ Rankings are an unmistakable reflection of global academic competition… they seem destined to be a fixture on the global education scene for years to come. Detractors notwithstanding, as they are refined and improved they can and should play an important role in helping universities get better.” Ben Wildavsky, The Great Brain Race (Princeton University Press, May 2010)
Despite major flaws, the WUR became massively influential. “University rankings are powerful. They compel public attention and shape the behaviour of universities and policy makers… The rankings date only from 2003 and 2004 but already they are everywhere in the sector and beyond. They set university reputations.” Simon Marginson, Centre for the Study of Higher Education, University of Melbourne.
Times Higher Education’s responsibility. “The responsibility weighs heavily on our shoulders. We are very much aware that national policies and multimillion-pound decisions are influenced by the rankings… We feel we have a duty to improve how we compile them. “Rankings are here to stay. But we believe universities deserve a rigorous, robust and transparent set of rankings – a serious tool for the sector, not just an annual curiosity.” Ann Mroz, Editor, Times Higher Education, November 2009
Times Higher Education’s responsibility “ We will make the Times Higher Education World University Rankings the most respected, authoritative and widely cited global ranking on the market.” Ann Mroz, Editor, Times Higher Education, November 2009
What now? We have signed a deal with  Thomson Reuters , the world’s leading research data specialist, to work with us to produce a new and improved ranking methodology for 2010 and beyond, and to collect and analyse all of our rankings data. With Thomson Reuters, our editorial board, and our readers, we are working to develop a new methodology.
What now? “ In addition to unmatched data quality, Thomson Reuters provides a proven history of bibliometric expertise and analysis. We are proud that our data continues to be chosen by leading organisations around the world and we’re happy to provide insight and consultation on such a widely respected indicator,” Jonathan Adams, director of research evaluation, Thomson Reuters
What now?
Confirmed improvements: reputation survey * Third party experts, polling firm Ipsos Mori, to undertake the reputation survey * Properly targeted and carefully sampled responses – Unesco demographics * Only published researchers asked * Action-based questions * 13,388 high quality respondents, compared to just 3,500 gathered by QS in 2009
Confirmed improvements: reputation survey 13,388 responses breakdown
Confirmed improvements: research measures Thomson Reuters’  Web of Science platform provides academics and university administrators with access to the world’s leading citation databases, covering: * 12,000 of the highest-impact academic journals * More than 110,000 conference proceedings
Confirmed improvements: subject based All data collected at the subject level Six subject areas to be examined (compared to five in 2004-09): Arts and humanities Life Sciences Physical Sciences Engineering and Technology Clinical, Pre-clinical and Health Social Sciences
Confirmed improvements: More data variables
Confirmed improvements: The methodology
Confirmed improvements: The methodology. More “academically robust” – Ross Williams, Melbourne Institute. “On the whole, the THE rankings will gain enhanced credibility as a result of these new changes” – Ed Byrne, vice-chancellor, Monash University.
Over to you
Thank you.  Stay in touch. Phil Baty Times Higher Education T.  020 3194 3298 E.  phil.baty@tsleducation.com

Editor's Notes

  1. Good morning. I’m delighted to be here in such esteemed company. Apologies to Professor Liu – we were both in Beijing last week and he had to listen to a very similar presentation then! So it is confession time. I have spent the last few months admitting that the world rankings my magazine has been publishing for the last six years are not fit for purpose. So we’ve ripped them up and we are starting again with an entirely new ranking system for 2010. Today I’m going to explain why rankings have become so important and what was wrong with our old rankings. I’ll explain what we’re doing to improve the rankings in our exciting new partnership with Thomson Reuters. I hope to convince you all today that we have learned important lessons, and that the major changes we are making to the rankings will make them much more rigorous, balanced, sophisticated and transparent – giving a more accurate picture of the global higher education landscape and providing a legitimate tool for the sector, rather than an annual headline-grabbing curiosity.
  2. So first of all, I’ll briefly introduce you to my magazine. This is Times Higher Education – “THE” for short. Founded in 1971 as a tabloid newspaper – previously known as the Times Higher Education Supplement. Re-launched in January 2008 as a weekly magazine for all professionals in higher education. We dropped the “supplement” from our title. Now just THE.
  3. We are also a daily website. The website was launched in 1994 – a first in UK publishing. All news, opinions, book reviews and features from the weekly print magazine are published online every Thursday. There is also a dedicated daily higher education news and opinions section. 100,000 unique visitors every week from all around the world. More than one million visits in 24 hours on 8 October 2009 – rankings day.
  4. And this is our parent company – TSL Education Ltd, based in Bloomsbury, London. You’ll see from the slide that all our businesses are in education. The day-to-day mission at Times Higher Education, which we’ve been involved in for almost 40 years, is to be an authoritative and respected source of information for the global higher education community. We are accountable to that community, so it is crucial that our rankings stand up to the close scrutiny of academics and university staff. As experts on higher education working for the people in higher education, we are acutely aware that universities are extraordinarily complex organisations, which do many wonderful, life-changing and paradigm-shifting things which simply cannot be measured. We know that universities cannot really be reduced to a crude set of numbers. So why do we rank at all?
  5. Well first of all, higher education is rapidly globalising. SLIDE We believe strongly that rankings, despite their limitations, help us understand this process.
  6. And rankings do serve a valid purpose. This slide shows a series of quotes from the US Institute for Higher Education Policy on the influence of rankings. QUOTE Love them or hate them, rankings are here to stay. As governments start adjusting to a future economy driven by knowledge and innovation – and make creating world-class universities a key national economic policy priority – the rankings can only get more and more influential.
  7. So between 2004 and 2009 all the data and analysis for the rankings were supplied by a company called QS. 40 per cent of the overall score was based on “Peer Review”. I’ve now banned that description. It was simply an opinion survey of university staff, asking them to rate the best universities. 10 per cent was based on “recruiter review” – a survey of graduate recruiters. 20 per cent was based on the SSR – that was the best proxy for teaching quality we could come up with at the time, based on the idea that smaller class sizes = better teaching. Crude, I know. 20 per cent was a measure of research excellence – looking at the number of times an academic’s published work was cited by others. 5 per cent was based on the proportion of overseas students at an institution. 5 per cent was based on the proportion of overseas staff at an institution. It was an attempt to get balance and breadth. But we have torn up the rankings and will start again. We have ended the arrangement with QS. They will have no further involvement in our rankings.
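To make the arithmetic of that old scheme concrete, here is a minimal sketch of how six indicator scores combine under the 2004-09 weightings just described. It is only an illustration: the indicator values are invented for a hypothetical university, and only the weights come from the scheme itself.

```python
# Minimal sketch of the old 2004-09 THE-QS weighting arithmetic.
# The indicator scores (0-100) below are invented for a hypothetical
# university; only the weights reflect the scheme described above.

OLD_WEIGHTS = {
    "peer_review": 0.40,             # academic reputation survey
    "recruiter_review": 0.10,        # graduate employer survey
    "staff_student_ratio": 0.20,     # proxy for teaching quality
    "citations_per_staff": 0.20,     # research impact, un-normalised
    "international_students": 0.05,
    "international_staff": 0.05,
}

def composite_score(indicator_scores: dict[str, float]) -> float:
    """Weighted sum of indicator scores, each already scaled 0-100."""
    return sum(OLD_WEIGHTS[name] * indicator_scores[name] for name in OLD_WEIGHTS)

hypothetical = {
    "peer_review": 92.0,
    "recruiter_review": 75.0,
    "staff_student_ratio": 60.0,
    "citations_per_staff": 55.0,
    "international_students": 80.0,
    "international_staff": 70.0,
}

print(f"Overall score: {composite_score(hypothetical):.1f}")  # 74.8
```

Note that the two surveys together carried half of the total weight (40 + 10 per cent), so a strong reputation score could mask weak quantitative indicators – the 50 per cent figure the later notes compare the new draft methodology against.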
  8. For what it’s worth, some key findings… The UK does very well – the cause of a great deal of suspicion, coming from a UK-based publication! This damaged the credibility of the tables.
  9. The biggest theme to emerge from 2009 was the rise of Asia. Rick Levin: “The rapid economic development of Asia since the Second World War… has altered the balance of power in the global economy and hence in geopolitics.” South Korea’s world-class university project provides about £4 billion in funding to 18 universities. China has been intensively developing its policy to build world-class universities for the last decade, with resources concentrated on its elite institutions. The number of Chinese who enrol in college each year has quintupled, rising from 1 million students in 1997 to more than 5.5 million students in 2007. China’s gross enrolment rate for tertiary education stands at 23 per cent, compared to 58 per cent in Japan, 59 per cent in the UK, and 82 per cent in the United States.
  10. So the old rankings do help us get a broad picture of the global landscape. But as I said at the start, we no longer believe that they are fit for purpose, so we have scrapped them. Why? They have had a lot of criticism.
  11. There’s plenty more where that came from
  12. But by far the most stinging attack came from Andrew Oswald. In 2007, he mocked the pecking order of that year’s rankings: Oxford and Cambridge were joint second, while Stanford University was 19th – despite having “garnered three times as many Nobel Prizes over the past two decades as the universities of Oxford and Cambridge did combined”. He said we should be unhappy with ourselves. And believe me, we were. A new editor arrived in 2008, and she put me in charge of the rankings in 2009. We had a review and did not like what we saw.
  13. We convened a meeting of our editorial board, including: Drummond Bone, HE consultant and advisor to the UK government; Bahram Bekhradnia, director of the Higher Education Policy Institute; Malcolm Grant, Provost of UCL; Simon Marginson, Professor of HE, University of Melbourne; and Philip Altbach, director, Center for International Higher Education, Boston College.
  14. Research by Michael Bastedo from the University of Michigan found that the single biggest factor in achieving a good reputation score was the previous year’s ranking position. The measure is also open to manipulation – the US website Inside Higher Ed reported last year on a senior administrator who revealed that her colleagues were instructed routinely to give low ratings to all programmes other than their own institution’s. But reputation is crucial in a globalised and competitive HE world. It also helps get a sense of some of the important but intangible things. And it is a measure that people want.
  15. So it is a good measure to have, but only if the measurement is done properly. And it wasn’t done properly in my view. Look at those figures. Tiny numbers. Shocking and not good enough.
  16. The other big problem we had was with the research excellence measure. By simply measuring the volume of citations against staff numbers, the old rankings took no account of the dramatically different citation habits between disciplines. Med school advantage. LSE example – could be true of Stanford too. It was 16th in the 2009 rankings.
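The standard remedy, and the direction hinted at here, is to judge each paper against the world average citation rate for its own field and year rather than counting raw citations per member of staff. The toy sketch below illustrates that idea; the baseline figures are invented for illustration, not real Web of Science averages.

```python
# Toy illustration of field-normalised citation impact.
# World baselines (average citations per paper for a field/year) are
# invented here; real baselines come from the citation database itself.

WORLD_BASELINE = {
    ("molecular_biology", 2007): 14.2,
    ("history", 2007): 1.1,
}

def normalised_impact(papers: list[tuple[str, int, int]]) -> float:
    """papers: (field, year, citations). Returns the mean ratio to the
    world average for the same field and year, so 1.0 = world average."""
    ratios = [cites / WORLD_BASELINE[(field, year)]
              for field, year, cites in papers]
    return sum(ratios) / len(ratios)

# Ten citations mean very different things in different disciplines:
biology = [("molecular_biology", 2007, 10)]
history = [("history", 2007, 10)]
print(normalised_impact(biology))  # ~0.70 -- below world average
print(normalised_impact(history))  # ~9.09 -- far above world average
```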
  17. Do SSRs really tell us much about actual teaching quality? Waiters in a restaurant? Should it be worth 20 per cent? Hard to verify. Marny Scully of the University of Toronto demonstrated how a ratio of anywhere from 6:1 to 39:1 can be achieved from the same data. No doubt manipulation goes on. International student score: a university’s ability to attract students from around the world is clearly important, but... the THE editorial board recently told us tales… scholarships, busloads, etc. Bums on seats alone can’t tell us much. Same problems looking at the number of staff attracted from overseas. Awful story: the University of Malaya was 89th in 2004; the next year, 169th. A data problem.
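To illustrate the Scully point, the hypothetical figures below show how one institution can report very different staff-student ratios depending on which staff and students are counted. All numbers and counting rules here are invented for illustration only.

```python
# Hypothetical illustration: one institution, several defensible ratios.
# All figures and counting rules below are invented for illustration only.

students_headcount = 30_000        # every enrolled student counted once
students_fte = 24_000              # part-time students counted fractionally
academic_staff_headcount = 5_000   # includes research-only and clinical staff
teaching_staff_fte = 1_200         # staff who actually teach, full-time equivalent

print(f"{students_fte / academic_staff_headcount:.1f} : 1")   # 4.8 : 1
print(f"{students_headcount / teaching_staff_fte:.1f} : 1")   # 25.0 : 1
```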
  18. But despite all these clear weaknesses with the rankings, they became hugely influential. Back to that IHEP report I mentioned earlier…
  19. I think that the key point from this is “Refined and improved”.
  20. So here we are with a clean slate. This is what the editor, Ann Mroz, has said…
  21. She has made a firm and bold commitment, and has set the bar very high indeed. Read…
  22. Thomson Reuters is the world’s leading research data specialist – proud to have them as our new data supplier
  23. The man behind the rankings on the Thomson Reuters side. Evidence Ltd. Government advisor. Framework 7, Australia, REF etc etc. Couldn’t ask for a more informed partner.
  24. Thomson Reuters has invested vast resources into creating its Global Institutional Profiles Project, with full-time staff stationed around the world to build highly sophisticated data-driven profiles of more than 1,000 universities around the world. That project is not a ranking project, but the database it builds will be the foundation stone of our rankings.
  25. An attempt at a serious piece of social science. Key points: SLIDE No scattergun approach. No sign-up. Invitation only. Teaching and research. The first ever global teaching reputation measure. Just completed it. Delighted with the results. I can reveal today that we achieved 13,388 responses – almost four times as many as QS achieved in 2009. And high-quality responses.
  26. Really pleased with the results: Large numbers. But size is not the only thing that counts. Good mix of experience – teaching and research. Good geographical spread – particularly pleased to see such a high response rate from Asia and the Middle East. Good balance of subject areas. Further statistical manipulation will be applied to iron out any response biases.
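The note does not spell out what that statistical adjustment is; one standard technique for ironing out response bias is post-stratification, in which respondents are re-weighted so that their regional (or disciplinary) mix matches a target distribution such as the Unesco demographics mentioned on the slide. The sketch below illustrates the idea with invented proportions.

```python
# Illustrative post-stratification: re-weight survey responses so the
# regional mix matches a target distribution (e.g. derived from Unesco data).
# All proportions below are invented for illustration only.

target_share = {"Asia": 0.35, "Europe": 0.33, "Americas": 0.25, "Other": 0.07}
respondent_share = {"Asia": 0.28, "Europe": 0.40, "Americas": 0.27, "Other": 0.05}

weights = {region: target_share[region] / respondent_share[region]
           for region in target_share}

# Responses from under-represented regions count for more than one:
print(round(weights["Asia"], 2))    # 1.25
print(round(weights["Europe"], 2))  # 0.82
```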
  27. Citations Data. TR owns the Web of Science. Excellent resource. But… It is not just the size and depth of the citations database, it’s the expertise and understanding of the data that comes with working directly with the data owners, Thomson Reuters. Not just buying in a lump of data! Also have the expertise to understand it. It will be normalised
  28. Previous subject tables were a mere afterthought – just one measure used, the reputation survey. And we’ve seen how weak that was. Now everything is built from the subject level upwards. Reputation survey based on local, subject-specific knowledge. Fine grain: agriculture, not just Life Sciences. Citations data normalised for discipline, etc. The mix and balance of indicators we use can vary between subjects – e.g. the arts and humanities, where there is less confidence in the citations data.
  29. And here are the data points we are collecting directly from the universities themselves. So the rankings from 2010 will be made up of three key data sets: reputation survey results; bibliometric data already held by Thomson Reuters, and data collected by Thomson Reuters from individual institutions. And now over to the hard part – how do we pull this all together into a final methodology?
  30. So, just a week or two ago, we put out a draft methodology for consultation. Here it is in broad terms. 13 data points compared to just six in the old QS system. Four broad indicators – research, economic activity and innovation, international diversity, and a broad “institutional indicators” category – designed to capture things like teaching quality. The exact weightings for each of the 13 indicators, and the final weightings for the four broad indicators, are yet to be determined, but you’ll see from the graph that we do think it is appropriate to give a heavier weight to the research indicators, as there is much wider acceptance regarding the validity and suitability of the bibliometric data we have. Reputation measures will be split into two categories – a research reputation measure to be used in the research category, and for the first time ever, a teaching reputation measure, in the “institutional” category. At present, we are planning that both reputation indicators together make up no more than 20-25 per cent of the overall ranking score – compared to 50 per cent in the QS tables.
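The note gives the structure (13 indicators grouped into four broad categories) but not the final weightings. A common way to combine indicators that sit on very different scales is to standardise each one across institutions before applying weights; the sketch below illustrates that generic step only, with placeholder data, and is not the 2010 methodology itself.

```python
# Generic illustration of combining heterogeneous indicators into one
# category score: standardise each indicator across institutions
# (z-scores), then average within the category. All data are placeholders.

from statistics import mean, pstdev

def zscores(values: list[float]) -> list[float]:
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Each list runs across institutions A, B, C; keys are indicators
# belonging to one broad category (figures are placeholders).
research_indicators = {
    "papers_per_staff": [4.1, 2.3, 6.0],
    "citation_impact":  [1.4, 0.9, 1.8],
    "research_income":  [120.0, 45.0, 300.0],   # arbitrary units
}

standardised = [zscores(values) for values in research_indicators.values()]
category_score = [mean(column) for column in zip(*standardised)]
print(category_score)   # relative standing of institutions A, B, C
```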
  31. So that draft methodology has gone to a group of around 50 expert advisors – our editorial board and a wider Thomson Reuters platform group. The initial feedback has been very positive. Here are two very early responses from Australia. But of course we still have a lot of work to do.
  32. But to make sure we’re as accountable as it is possible to be, we need constant criticism and input. Only with the engagement of the higher education sector will we achieve a tool that is as rigorous and as transparent and as useful as the sector needs and deserves. So please use the sites and tools above to make sure you have your say and tell us what you think.