International university rankings are increasingly used by various stakeholders to compare higher education institutions, though they are also subject to criticism. This document provides an overview of some of the most prominent international rankings, including the THE, QS, Shanghai, and Leiden rankings, and discusses the methodology, indicators, and criticisms of each. The document also examines the positioning of Dutch universities such as TU Delft in these rankings and notes newer initiatives, such as U-Multirank, which aim to be more multi-dimensional and personalized.
International rankings of universities: An overview for managers M&C TU Delft, by Kim Huijpen
Presentation on international rankings of universities by Kim Huijpen.
Presented on Monday, 17 October 2011, to the Marketing & Communication managers of Delft University of Technology.
Sheets 3, 4, 5, 6, 8, 16, 19, 20, 21, 22, 26, 27, 28 and 29 are from earlier presentations by Johan Verweij. Sheets 2 and 23 are based on sheets from earlier presentations by Johan Verweij.
“Classification and ranking in Europe” by Hans Hoving (SATN)
- The document discusses international university rankings and their impact on higher education institutions (HEIs). It notes that many HEIs feel pressure to improve their rankings and are adapting their policies and strategies as a result.
- It also discusses some of the limitations and criticisms of international rankings, such as their research bias, and the need for alternative ranking systems that consider other factors like learning outcomes.
- Several European organizations are working on developing alternative ranking systems for universities that aim to provide a more valid and fair comparison across institutions.
1) New rankings like U-Multirank and US News global ranking represent a new era of comprehensive, quality-focused multidimensional rankings that assess more than just the top 500 institutions.
2) Over the last 20 years there has been a massification of higher education and increased internationalization and globalization of knowledge economies.
3) Rankings are an imperfect but effective quality assurance tool for comparing universities, even though they sometimes irritate institutions.
This document summarizes a presentation about U-Multirank, a new multi-dimensional ranking system for universities. U-Multirank aims to address limitations of existing rankings by ranking universities based on 30 indicators across teaching, research, knowledge transfer, international orientation, and regional engagement. It allows users to compare universities in different ways rather than providing an overall score. The presentation provides details on U-Multirank's development, launch, and inclusion of over 850 universities in its first release in 2014. Future expansion and additional benefits for participating universities are also discussed.
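The multidimensional idea behind U-Multirank can be sketched in a few lines: report a rating per dimension instead of collapsing all indicators into one composite score. The five dimensions follow the text above; the scores, thresholds, and letter bands below are purely illustrative.

```python
# Illustrative sketch of per-dimension ratings instead of a composite score.
# Dimension names follow U-Multirank; scores and thresholds are invented.
scores = {
    "teaching": 0.8,
    "research": 0.6,
    "knowledge transfer": 0.9,
    "international orientation": 0.4,
    "regional engagement": 0.7,
}

def rating(score):
    """Map a normalized score to a letter band (thresholds are illustrative)."""
    return "A" if score >= 0.8 else "B" if score >= 0.6 else "C"

profile = {dim: rating(s) for dim, s in scores.items()}
print(profile)  # no overall rank: users compare the dimensions they care about
```

Because no weighted sum is ever computed, there is no single "best" university in this model; different users reading the same profile can legitimately reach different conclusions.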
URAP 2014-2015 World Ranking and URAP-TR, presentation of 12 November 2014, by Murat KOÇAK
The document summarizes the University Ranking by Academic Performance (URAP) 2014-2015 rankings. It describes the establishment of the URAP Research Laboratory at Middle East Technical University in 2009 to rank universities based on academic performance. It provides details on the methodology used, indicators measured, data sources, and presents the top 15 universities in the URAP world ranking and top 15 Turkish universities. Over 2000 universities were ranked overall and in 23 fields based on 6 indicators of academic performance and research impact.
A scientometric perspective on university ranking, by Nees Jan van Eck
This document discusses responsible use of university rankings. It summarizes a presentation given by Nees Jan van Eck of CWTS about their Leiden university ranking methodology. The presentation outlines principles for responsible ranking design, interpretation, and use. It emphasizes using transparent, field-normalized bibliometric indicators to measure research impact rather than composite scores. Comparisons should consider size and subject differences between universities. Ranks are less important than underlying indicator values. Non-research metrics are also important to consider.
Organisational complexity as a challenge to research assessment: a case study... by ORCID, Inc.
The University of Oxford faces significant challenges for research assessment due to its large size, organizational complexity, and decentralized structure. It has over 5,800 academic staff across 4 divisions and 44 separate colleges. Data is scattered across different departments and systems, and collecting consistent information for research assessment takes around 2.5 years. Assigning outputs to the correct units of assessment is difficult when research spans many disciplines. Ensuring consistent quality of case studies across the university's 31 units of assessment is also a challenge. Fully capturing the university's impact, which may occur in broad fields across departments, requires comprehensive data collection. Adopting common standards like ORCID iDs could help integrate data, but implementing them across the university's complex structures takes significant effort.
This document discusses global university rankings and compares different ranking systems. It finds that while rankings can influence universities, they can also be manipulated and criticized by academics. The document analyzes the top 10 universities according to different ranking organizations in 2009 and attempts to empirically derive a "true" ranking by minimizing distances between the rankings. It also discusses factors impacting research output at Turkish universities and problems with current ranking methodologies.
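The "minimizing distances between the rankings" step described above can be illustrated with a small consensus-ranking sketch: given several input orderings, exhaustively search for the ordering that minimizes the total Spearman footrule distance (the sum of absolute rank displacements) to all of them. The university names and input rankings below are invented for illustration, and exhaustive search is only feasible for short lists.

```python
# Illustrative sketch: consensus ranking by minimizing total Spearman
# footrule distance to the input rankings. Rankings are hypothetical,
# and brute force over permutations only works for small n.
from itertools import permutations

def footrule_distance(order_a, order_b):
    """Sum of absolute rank displacements between two full orderings."""
    pos_a = {item: i for i, item in enumerate(order_a)}
    pos_b = {item: i for i, item in enumerate(order_b)}
    return sum(abs(pos_a[x] - pos_b[x]) for x in order_a)

def consensus(rankings):
    """Exhaustively find the ordering closest to all input rankings."""
    items = rankings[0]
    return min(
        permutations(items),
        key=lambda order: sum(footrule_distance(order, r) for r in rankings),
    )

rankings = [
    ["Harvard", "MIT", "Stanford", "Cambridge"],
    ["MIT", "Harvard", "Cambridge", "Stanford"],
    ["Harvard", "Stanford", "MIT", "Cambridge"],
]
best = consensus(rankings)
print(best)  # ('Harvard', 'MIT', 'Stanford', 'Cambridge')
```

For longer lists, the footrule-median problem can be solved exactly as an assignment problem rather than by enumeration, but the brute-force version above is enough to show the idea of an empirically derived "true" ranking.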
The document discusses the University Ranking by Academic Performance (URAP) project. URAP ranks 2000 universities globally and 125 Turkish universities based on academic performance indicators collected from sources like ISI and SCOPUS. The top universities are listed for indicators like number of articles, citations, and international collaboration. Harvard University ranks first in the overall URAP world ranking. The methodology, data collection, scoring, and results of the URAP 2012-2013 rankings are presented.
The Institutional Profiles project collects multidimensional data from over 1,000 leading academic institutions globally to profile their activities and performance. Data is gathered through an annual academic reputation survey, direct collection from institutions, and bibliometric sources. Institutions provide detailed information on areas like staff, students, degrees, funding, and subject-level activities. The data aims to be high-quality, internationally comparable, and minimize workload for institutions. Analysis benchmarks data to account for subject differences and allows custom comparison of institutions' key performance indicators, trends, and relationships to peers. The Institutional Profiles provide an excellent resource for exploring academic institutions and understanding their competencies.
Presentation to DC Higher Education Group on the State of the Humanities, by Robert Townsend
Presentation to the DC Higher Education Group on the state of the humanities, with recent findings from the department survey and Humanities Indicators.
A scientometric perspective on university ranking, by Ludo Waltman
This document discusses responsible use of university rankings, using the CWTS Leiden Ranking as an example. It outlines principles for ranking design, interpretation, and use. Key points include distinguishing size-dependent and size-independent indicators, acknowledging uncertainty, and considering values beyond ranks. Rankings provide valuable but limited information and should not be used simplistically or as a sole performance measure. Multiple dimensions of university work are not represented in rankings.
Responsible use of university rankings, by Ludo Waltman
1) The CWTS Leiden Ranking focuses solely on research performance and is based purely on bibliometric indicators derived from the Web of Science database, without composite scores or input from universities.
2) The 2018 edition ranked 938 universities from 55 countries that published at least 1,000 documents in the period 2013-2016.
3) When designing rankings, indicators should distinguish between size-dependent and size-independent metrics, universities should be consistently defined, and rankings should be transparent. Comparisons between universities require acknowledging differences between them and the uncertainty in rankings.
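The distinction between size-dependent and size-independent indicators drawn above can be made concrete with a small sketch. In Leiden Ranking terminology, a count of highly cited publications (often written P(top 10%)) is size-dependent, while the proportion of highly cited publications (PP(top 10%)) is size-independent; the figures below are invented to show how a smaller university can trail on the first indicator yet lead on the second.

```python
# Illustrative sketch of size-dependent vs size-independent indicators.
# Publication counts are invented for illustration.
universities = {
    # name: (total publications, highly cited publications)
    "University A": (8000, 1200),
    "University B": (2000, 400),
}

indicators = {
    name: {
        "P_top": top,            # size-dependent: absolute count
        "PP_top": top / pubs,    # size-independent: proportion
    }
    for name, (pubs, top) in universities.items()
}

for name, ind in indicators.items():
    print(f"{name}: P_top = {ind['P_top']}, PP_top = {ind['PP_top']:.0%}")
```

Here University A dominates on the absolute count (1200 vs 400) but University B has the higher proportion (20% vs 15%), which is why comparisons between universities of different sizes should state which kind of indicator is being used.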
The document discusses the promises and challenges of technology-based assessment (TBA). It notes that existing assessment models are failing and new technology allows for more engaging, multi-sensory assessments. TBA can provide more data to improve education systems and increase the speed of assessment. Several countries and projects are implementing TBA, including developing online diagnostic assessments in Hungary. TBA faces technological challenges but may transform assessment by improving quality and speed of feedback to improve learning outcomes.
Ludo Waltman presents principles for responsible university ranking. He discusses 10 rules for ranking universities, including that one size does not fit all universities, rankings should be transparent about their methodology and data, and they should acknowledge uncertainty. He then highlights the 2019 edition of the CWTS Leiden Ranking, which newly includes indicators of open access publications and gender diversity among authors. Waltman concludes by emphasizing the social responsibility of both rankings and universities to encourage responsible behavior from rankers.
Understanding World University Rankings, Sichuan University, July 2014, by University of Limerick
The document discusses university league tables and rankings. It provides background on how rankings are calculated using various quality indicators and the purpose they serve for different audiences. It notes that while rankings have limitations, they are an important reference point for prospective students, partners, and governments. The document advises universities to understand how they are evaluated in rankings and focus their strategies on improving indicators that are aligned with their mission goals.
CWTS Leiden Ranking: An advanced bibliometric approach to university ranking, by Nees Jan van Eck
This document summarizes a presentation about the CWTS Leiden Ranking, a university ranking produced by the Centre for Science and Technology Studies (CWTS) at Leiden University. It provides details about CWTS, the Leiden Ranking methodology, indicators, selection of universities, and differences from other rankings. The presentation emphasizes the importance of using bibliometric indicators, fractional counting of publications, and focusing on highly cited publications. It concludes with principles for the responsible use and interpretation of rankings to avoid simplistic comparisons and ensure rankings are used appropriately.
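The fractional counting mentioned above can be sketched in a few lines: a publication co-authored by n universities contributes 1/n of a publication to each, so the fractional totals across universities sum to the number of publications. The affiliations below are hypothetical.

```python
# Illustrative sketch of full vs fractional counting of publications.
# Affiliation lists are invented for illustration.
from collections import defaultdict

publications = [
    ["Leiden", "Delft"],             # one paper shared by two universities
    ["Leiden"],
    ["Delft", "Leiden", "Utrecht"],
]

full = defaultdict(float)
fractional = defaultdict(float)
for affiliations in publications:
    share = 1 / len(affiliations)
    for uni in affiliations:
        full[uni] += 1               # full counting: each university gets credit 1
        fractional[uni] += share     # fractional counting: each gets credit 1/n

print(dict(fractional))  # Leiden: 0.5 + 1 + 1/3
```

Under full counting Leiden would be credited with 3 publications; under fractional counting it gets about 1.83, and the fractional credits of all universities add up to exactly the 3 publications in the set, which avoids double-counting collaborative work.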
The document discusses university ranking systems and Pakistan's performance in various rankings. It provides an overview of popular global university rankings like QS, THE, ARWU, USNWR, and subject-specific rankings. It outlines the methodology, indicators and Pakistan's ranking in these systems. Additionally, it presents Punjab University's efforts to improve its ranking by focusing on key indicators like faculty, students, research, reputation and finances. It recommends further strategies like establishing a data center, international student office and improving ICT facilities.
ABC-PhD program, Politecnico di Milano: an update for Tanzania, by Enrico DeAngelis
1. … (effective tutor: Ronca)
2. BIM for the management of construction sites: MARCO FERRARI (effective tutor: Ronca)
3. BIM for the management of building facilities: FRANCESCO GALLI (effective tutor: Ronca)
4. BIM for the management of building energy performance: FRANCESCO MUSSO (effective tutor: Ronca)
5. BIM for the management of building safety: GIULIA PEDRINI (effective tutor: Ronca)
6. BIM for the management of building maintenance: MATTEO RIZZI (effective tutor: Ronca)
This study surveyed scholars about their perceptions of open access journals in educational technology. It found that:
- Refereed articles and journal purpose were seen as the most important traits of any journal. Impact factor and number of readers were less important.
- The likelihood of publishing in an open access journal in the next 18 months was higher for authors who had previously published in open access journals.
- Most respondents expected to see increases in the number of open access journals, strength of peer review processes, and impact of open access journals on scholarship over time.
- The most influential journals identified were EDUCAUSE Quarterly/Review, the Australasian Journal of Educational Technology, and Educational Technology & Society.
New developments in the CWTS Leiden Ranking, by Ludo Waltman
This document discusses new developments in the CWTS Leiden Ranking. It introduces indicators of gender balance and open access publishing that have been added to the ranking. The gender indicators measure the proportion of male and female authors and authorships. The open access indicators measure the proportion of a university's publications that are openly accessible via various open access routes like gold, hybrid, green or bronze open access. The document provides examples of these new indicators for different universities and regions to demonstrate how they can provide insights into gender balance and open access practices over time.
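The open access indicators described above boil down to simple proportions over publication counts by route. The route names follow the text; the counts below are invented to show the arithmetic.

```python
# Illustrative sketch: open access share by route. Counts are invented.
pubs_by_route = {
    "gold": 300,
    "hybrid": 150,
    "green": 250,
    "bronze": 100,
    "closed": 700,
}

total = sum(pubs_by_route.values())
oa_total = total - pubs_by_route["closed"]

print(f"Overall OA share: {oa_total / total:.0%}")  # 800 of 1500 publications
for route, n in pubs_by_route.items():
    if route != "closed":
        print(f"  {route}: {n / total:.1%}")
```

Computed per university and per publication year, these proportions make it possible to track how a university's open access practice develops over time, which is exactly the kind of insight the summary above describes.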
An in-depth bibliometric perspective on China’s scientific performance, by Ludo Waltman
This document discusses China's scientific performance based on bibliometric analysis. It finds:
- China's scientific output and impact have grown tremendously, with its share of world publications rising from 3% in 2000 to 17% in 2015.
- Chinese research is particularly strong in physical sciences, engineering, mathematics and computer science.
- Analysis of individual institutions like Zhejiang University and Fudan University reveals their research strengths in specific micro-level research areas.
- The document advocates for responsible use of bibliometrics and more detailed analyses to provide context beyond high-level statistics.
In this presentation we show some findings on the prevalence of researchers reporting a university-industry dual affiliation in different European countries.
G. Martineli, New SEENET-MTP Partner: SISSA - International School for Advanc... (SEENET-MTP)
SISSA is a research and graduate education institution located in Trieste, Italy. It has over 30 full and associate professors and focuses its research on physics, neuroscience, and mathematics. SISSA has 243 PhD students from over 30 countries, and the institution publishes around 400 papers per year in high-impact journals. In addition to research and education, SISSA aims to transfer technology and knowledge to society through various outreach programs and spin-off companies. SISSA also runs an interdisciplinary master's program called MCA to train the next generation of leaders at the intersection of science and business.
Lessons learned in the launching of corporate universities, by Antonio Rubio
The document discusses lessons learned from launching corporate universities. To avoid superficiality, decisions should not be based solely on politics and new models should not just involve briefing suppliers. To avoid issues with development, different training methodologies and a learning strategy are needed. To avoid problems with the working process, a common language and centralized workflow system are required before starting activities. To enhance strategic positioning, CEO commitment is key at every step along with defining a unique leadership style. To enhance the organization, a combination of physical and virtual campuses is recommended alongside certification programs and ensuring the team structure focuses on internal clients.
Indicadores nuevo modelo de aprendizaje v2, by Antonio Rubio
The document proposes indicators for measuring the progress of implementing new learning models in an organization. Key indicators covering quality, participation, innovation, implementation, and satisfaction are defined for three phases: 1) evaluation of traditional training, 2) evaluation of the transition to new models, and 3) evaluation of new models already in place. Metrics are specified for each indicator in order to measure the progress of implementing new learning approaches based on...
Evolving Corporate Universities London10_11th_february_josedejuanJose De Juan
This document discusses how Gas Natural Fenosa's corporate university has transitioned from an e-learning model to an e-knowledge model based on Web 2.0 technologies and user-generated content. The corporate university previously provided 800,000 training hours to 54,000 participants annually through centrally-developed courses, but will now leverage new formats like video, wikis and blogs created by internal experts and other users. This e-knowledge approach allows for dynamic, decentralized knowledge sharing across the company and saves some costs while providing enormous added value through increased collaboration and information exchange.
U-Spring: 2016 Corporate University Global Survey ResultsBPI group
Results of BPI group's 2016 global survey on corporate universities and new methods of organizational learning. Join us in reimagining the corporate university!
CORPORATE UNIVERSITY INFRASTRUCTURE for SUCCESSKenny Ong
Evolving Corporate Universities (ECU) Webinar
December 2011
• The importance of embedding a best practice management infrastructure in the organisation structure and administrative systems at an early stage in the formation of a CU
• how a company's best practice administrative infrastructure systems ensure that the best practices are visible
• how best practices can be applied and possibly how they can also leave opportunities for innovations in relation to the company's strategic vision and objectives
"Think as a Corporate University" is a journey that departs from actual trends, shows us the new challenges of both CLO as well as her/his Team, and arrives to the pillars needed to build a succesful and meaningful learning experience.
This presentation stresses the importance of building the Corporate University driven by the corporate behavioral values synthetized in the brand.
Motorola University is the corporate university of Motorola Inc. that was established in 1974 in Chicago to provide training programs for Motorola employees. It has since expanded to provide services to Motorola's clients, suppliers, and partners. The goals of Corporate Universities like Motorola University are to organize training, promote continuous learning, support organizational change, maximize the return on education investments, and foster a common culture and loyalty. Motorola University operates through five institutes focused on quality, leadership, supply chain, engineering, and marketing to contribute to Motorola's sustained success.
The document outlines 9 steps to creating a successful corporate university: 1) determine strategic direction with senior management support; 2) define scope and stakeholders; 3) plan governance structure and funding; 4) hire appropriately skilled staff; 5) develop aligned curricula; 6) market effectively; 7) use metrics to measure success; 8) learn from best practices of other universities; 9) ensure ongoing support from senior leadership. The goal is to address business and talent needs through lifelong learning.
University Rankings 2013 per settori disciplinari: ItaliaClay Casati
La presentazione riporta alcuni dati 2013 su Italia ed Europa dei tre più accreditati sistemi WUR (World University Rankings) di valutazione delle università.
Inizia con ARWU 2013 (Academic Rankings of World Universities), prosegue con i dati di QS World University Rankings e viene completata con le classifiche Times Higher Education World University Rankings.
ARWU è più centrata su risultati accademici e di ricerca.
QS e Times tengono conto anche delle opinioni di accademici e datori di lavoro.
L’Italia rimane l’unico paese del G8 (Stati Uniti, Giappone, Germania, Francia, Regno Unito, Italia, Canada, e dal 1998 la Russia) che non ha nemmeno una università tra le prime 100 del mondo nelle 3 classifiche ARWU, QS e Times.
University Rankings,the Triple Helix Modeland Webometrics:Opening Pandora’...Han Woo PARK
This document discusses university rankings and proposes a new "Triple Helix" ranking model. It begins by introducing common university rankings and their methodologies. It then examines how webometrics data correlates with academic performance and could be used for alternative rankings. Finally, it proposes a Triple Helix ranking that evaluates university-industry-government collaboration using indicators like co-publications, citations in patents, and start-ups. Potential issues are acknowledged, like local context and unintended effects. The document argues the Triple Helix model is conceptually strong and a hybrid ranking tool could benefit multiple stakeholders if webometrics data is less biased than traditional metrics.
This document discusses the methodology and results of the Academic Ranking of World Universities (ARWU) conducted by Shanghai Jiao Tong University. It outlines the purposes of developing the ARWU, which were to evaluate Chinese universities' positions globally and measure the gap to becoming world-class. The methodology uses 6 objective indicators and internationally comparable data to rank over 1000 universities. It acknowledges issues with the methodology and ways to improve the rankings, such as addressing biases against certain fields and languages.
The document discusses the impact and implications of university rankings. It notes that while rankings aim to measure quality and compare institutions, they often reduce quality to a few quantifiable indicators and ignore important factors like teaching quality, student experience, and community engagement. As a result, rankings can distort institutions' priorities and behaviors. The document reviews research showing that rankings significantly influence students, employers, universities, governments, and academic work. Many countries are using rankings to restructure their higher education systems and concentrate resources in a small number of elite institutions.
The document discusses comparing education systems internationally and issues with ranking universities. It summarizes that rankings are often based on research output but this only measures one factor. European universities are increasing their presence in top rankings. The ideal ranking considers lifetime utility and added value of attending different schools rather than just earnings, as non-pecuniary returns may be as large as financial returns. Students should get informed, be prepared for cultural and academic shocks, and consider multiple factors and their own abilities rather than relying solely on rankings.
- International university rankings provide composite measures of academic quality, research output, teaching quality, and international reputation based on metrics like peer review surveys, citation counts, faculty-student ratios, and percentages of international students and faculty.
- While rankings are not intended as strict league tables, they have significant real-world impacts by influencing government and university policies and priorities around the world as well as student choice.
- Rankings also create competition between institutions to improve their standing through strategies like increasing research productivity, attracting top faculty and students internationally, and enhancing teaching quality and graduate employability.
International league tables comparing universities are increasingly influential but also controversial. While they provide information for various stakeholders, their methodologies are questionable and criteria may not fully capture university quality. They can also incentivize universities in unintended ways and be politically manipulated. Most argue tables should be used carefully while also working to improve teaching, research, and other outcomes that tables measure.
The document discusses new demands placed on universities and strategies for the future. It notes that universities now have increased strategic roles in research, technology transfer, and education to address globalization and knowledge-based societies. However, there are also obstacles to change, like academic conservatism and resistance to changing roles and structures. The document argues that universities must open towards society to meet new needs while balancing research, education, and innovation as key parts of their mission.
The document discusses the purpose, methodology, results, and future efforts of the Academic Ranking of World Universities (ARWU). The key points are:
1. ARWU was created to evaluate the position of Chinese universities globally and measure the gap to becoming world-class universities, which has been a goal of Chinese higher education initiatives.
2. The methodology uses objective data on alumni winning Nobel Prizes, highly cited researchers, papers in Nature and Science, and other indicators to rank over 1000 universities worldwide.
3. Results have shown the top 500 universities since 2003 and have identified rankings by region, country, and fields of study. Limitations include not representing all university functions and sizes equally.
The document outlines the methodology used in the Academic Ranking of World Universities (ARWU). It discusses the history and development of ARWU since 2003. It describes the selection criteria including alumni awards, staff awards, highly cited researchers, papers in Nature and Science, total papers, and per capita performance. It also summarizes the results, features and impact of ARWU, and considers future directions including subject rankings, improvements to the methodology, and profiling of universities.
Conferencia a cargo de Ben Sowter, jefe de la Unidad de Investigación de QS.
La conferencia se presentó en el 1er Seminario Internacional sobre Rankings en Educación Superior y E-learning organizado por la UOC.
This document discusses research performance metrics for European universities compared to North American and Asian universities. It finds that while Europe has some universities that excel in certain fields, overall North America and Asia surpass Europe in terms of numbers of universities in the top 10% of research performance across fields. The US has many globally competitive universities that dominate scores of fields, far more than all European universities combined. Several Asian universities also demonstrate broad excellence, challenging the notion that competition was only a future threat.
"League Tables: valuable market information or dangerous nonsense" - presentation by Paul Greatrix and Tony Rich at AUA conference 2007 held at the University of Nottingham
The Ohio Center of Excellence in Knowledge-enabled Computing at Wright State University:
1) Shares the second position globally in impact on the World Wide Web and has the largest academic research group in the US working on semantic web, social media, big data, and health applications.
2) Has exceptional student success with internships and jobs at top companies and a total of 100 researchers including 15 highly cited faculty and 45 PhD students, largely funded through $2M+ annually in research funding.
3) Provides world-class resources for multidisciplinary projects across information technology and domains like biomedicine, with collaboration from industry partners like Google and IBM.
Terry Anderson is Director of Canadian Institute Distance Education Research (CIDER) at Athabasca University, Canada. Olaf Zawacki-Richter is Professor of Educational Technology at Carl von Ossietzky University of Oldenburg in Germany.
This shared presentation was delivered as part of the shared keynote speech at the 2014 EDEN Annual Conference in Zagreb.
http://www.eden-online.org
The document summarizes the results of a study that evaluated the research competitiveness and subject rankings of world-class universities and research institutions in 2009. Key findings include:
1) The US, UK, and Japan had the most universities ranked among the top 100, 200, and 300. Chinese universities were not ranked in the top 100 and only a few were ranked in the top 200-400.
2) Chinese research competitiveness improved but still lags behind other countries in areas like high-quality papers, Nobel Prizes, and research innovation/hot papers.
3) The study provides rankings of countries/regions and universities in different subjects to identify strengths and gaps, aiming to help universities strengthen top subjects
This document summarizes key points from a presentation on ranking universities and evaluating research performance. It discusses the challenges of different ranking and evaluation methods, including peer review, bibliometric analysis, and limitations of each. It also presents several findings from benchmarking and analysis, including that larger, top-performing universities can maintain high research performance across a broad range of activities, and that lower-performing universities receive a cumulative advantage in citations from increasing size.
Towards deeper and more sustained implementation of Assessment for learningDavid Carless
This document summarizes presentations from a symposium on Assessment for Learning (AfL) in higher education. It discusses four papers that will be presented: 1) scaling up AfL, 2) using rubrics to support AfL, 3) technology-enabled AfL, and 4) a discussant's perspectives. Some key themes that emerged were the need for critical perspectives beyond individual enthusiasm, larger implementation studies, and more longitudinal research on AfL. The document also reviews definitions of AfL, strategies for implementing it, factors that facilitate or inhibit scaling it up successfully, and the potential role of leadership, resources, communities of practice, and technology in enhancing AfL.
AIEA 2011 Presentation: Joint Degrees and Offshore Operations: An Internation...AEINorthAmerica
The document summarizes a presentation on international joint and double degree programs from a survey of institutions. Key points include: most institutions established their first joint/double degree program after 2000 and see them as part of internationalization strategies; the top three motivations were raising international visibility, advancing internationalization, and strengthening research collaborations; and the top three outcomes were greater faculty collaboration, increased visibility, and internationalization. The presentation also provided preliminary results on participating institutions and disciplines, as well as future plans. It concluded by thanking attendees and providing contact information for following up.
University rankings; an overview for the municipality of Delft July 2013
1. Challenge the future
International Rankings of Universities
An overview for the municipality of Delft
Kim Huijpen, Corporate Policy Affairs | 04/07/13
2.
About me
Kim Huijpen
• Policy Advisor TU Delft
• Strategic Development / Corporate Policy Affairs
• Member of the Delft city council
• Committee Society and Housing
• Member of D66
3.
International university rankings
1. Context
2. Criticism
3. Overview
4. The position of the TU Delft in important rankings
5. New initiatives to improve rankings
6. How do we use international rankings?
And how do you use international university rankings?
4.
Context
Rankings fill a need
• Stakeholders – students, parents, governments, accreditation councils, industry, (inter)national organizations – want to know the differences between HEIs and how they perform
Rankings are used more and more (directly, or indirectly via reputation)
• By the media
• By governmental institutions (reallocation of funds)
• By students (Asia)
• By HEIs themselves! For marketing purposes or to select partners for cooperation
• By local governments?
5.
International rankings, criticism and new developments
Most important international rankings in 2013
• QS, THE, Shanghai and Leiden rankings
• Not 4 rankings but many more (also subject & reputation rankings)
Criticism
• Content: bias towards big & old universities, focus on research, bias towards natural & medical sciences, language bias, comparison of whole HEIs
• Methodology: adding up all kinds of indicators, league-table numbering, dubious weighting, lack of transparency, institutions deliver their own data, methodological changes
6.
Criticism
Conceptual
1. Some universities have an advantage: Anglo-Saxon, natural-science and medical disciplines, focus on research, big, old, general
2. You can't compare whole universities
3. You can't add up all the indicators
Methodology
1. Weak underpinning of the weight factors
2. Sensitivity to outliers: best HEI = 100 (z-scores are better)
3. Methodological changes over time
Data
1. Limited or no insight into the raw data
2. Data provided by HEIs themselves: mistakes, manipulation
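The outlier-sensitivity point can be illustrated with a small sketch (the numbers are made up, not real ranking data): scaling every score against the best-performing HEI compresses all the other institutions into a narrow band, while z-scores keep their mutual differences visible.

```python
# Sketch: "best HEI = 100" scaling vs z-scores (illustrative numbers).
# The last university is an outlier on, say, citations per paper.
values = [10.0, 12.0, 11.0, 13.0, 12.5, 40.0]

# Max-based scaling: every score is expressed relative to the outlier,
# so the five ordinary universities are squeezed together.
best = max(values)
scaled = [100 * v / best for v in values]

# z-scores: each score is expressed in standard deviations from the mean,
# which is far less dominated by a single extreme value.
mean = sum(values) / len(values)
std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
z_scores = [(v - mean) / std for v in values]

print([round(s, 1) for s in scaled])    # ordinary HEIs all land between 25 and 33
print([round(x, 2) for x in z_scores])
```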
7.
Overview: similarities and differences

Ranking  | Focus                                  | Indicators            | Data                                      | Time          | Type
THE      | Research, Education, Internat., Income | Subjective, Objective | Own research, Dbase (WoS), Data HEI's     | Present       | General, Field
QS       | Research, Education, Internat.         | Subjective, Objective | Own research, Dbase (Scopus), Data HEI's  | Present       | General, Field
Shanghai | Research                               | Objective             | Dbases (e.g. WoS, Nobelprize.org)         | Past, Present | General, Field, Subject
Leiden   | Research                               | Objective             | Dbase (WoS)                               | Present       | General
HEEACT   | Research                               | Objective             | Dbase (WoS/ESI)                           | Present       | General, Field, Subject
8.
TU Delft in rankings '12 & spring '13

Range   | World University Rankings        | Engineering/Technology Rankings | Other Rankings         | Subject Rankings
Top 10  | -                                | -                               | UIRC Scoreboard (3)    | QS Civil & Struct. Eng. (4), QS Chemical Eng. (10)
Top 50  | -                                | QS (18), THE (32)               | -                      | QS Environmental Sciences (17), QS Materials Science (32), QS Mechanical Eng. (18), QS Electrical Eng. (42)
Top 100 | THE (77)                         | Shanghai (76-100)               | THE Reputation (51-60) | 4 QS Subject Rankings
Top 200 | QS (103), Leiden (164)           | -                               | -                      | 2 QS Subject Rankings
Top 300 | Shanghai (201-300), Taiwan (276) | -                               | -                      | -
9.
THE-ranking (with Thomson Reuters)
Fields:
•‘Engineering &
Technology’
•‘Life Sciences’
•‘Clinical, pre-clinical
& Health’
•‘Physical Science’
•‘Social Sciences’
•‘Arts & Humanities’
Ranking by field:
based on same 13
indicators with
slightly different
weights
10.
What is citation impact?
• Citation impact is one of the key indicators in most rankings
• With a citation, an author acknowledges the original author, year, title and source of an idea in a new publication
• Citations are a measure of the impact of the cited work
• Citation 'cultures' differ between disciplines; therefore citation impact is calculated normalized for field differences
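Field normalization can be sketched as follows: each publication's citation count is divided by the average citation count of publications in the same field, and the resulting ratios are averaged. This is the idea behind mean-normalized citation scores such as the MNCS indicator used in the Leiden Ranking; all numbers below are made up for illustration.

```python
# Sketch of field-normalized citation impact (the MNCS idea).
# Each paper's citations are divided by the world average for its field;
# a score of 1.0 means "cited exactly at the world average".
field_average = {"engineering": 4.0, "medicine": 12.0}  # mean citations per paper

papers = [
    ("engineering", 8),   # twice the field average -> 2.0
    ("engineering", 2),   # half the field average  -> 0.5
    ("medicine", 12),     # exactly average         -> 1.0
]

normalized = [cites / field_average[field] for field, cites in papers]
mncs = sum(normalized) / len(normalized)
print(round(mncs, 2))  # (2.0 + 0.5 + 1.0) / 3 -> 1.17
```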
11.
Position of 3TU, LDE & IDEA League in THE ranking

University                     | World University Rankings 2012-13 (2011) | Engineering & Technology Ranking 2012-13 (2011)
TU Delft                       | 77 (104)  | 32 (22)
TU Eindhoven                   | 114 (115) | -
Universiteit Twente            | 187 (200) | -
Universiteit Leiden            | 64 (79)   | -
Erasmus Universiteit Rotterdam | 72 (157)  | -
Imperial College London        | 8 (8)     | 10 (10)
ETH Zürich                     | 12 (15)   | 8 (9)
Ecole Polytechnique*           | 62 (63)   | 29 (29)
Aachen RWTH                    | 154 (168) | -

* ParisTech consists of eleven 'Grandes Ecoles Paris', of which Ecole Polytechnique is the best known.
12.
THE Engineering and Technology Universities 2012

Position | THE Engineering & Technology ranking 2012-13 (2011)
1        | Caltech, US (2)
2        | Princeton University, US (3)
2        | MIT, US (1)
4        | University of California, Berkeley, US (4)
5        | University of Cambridge, UK (6) / Stanford University, US (5)

Highest non-UK/US (and highest European): nr. 8 ETH Zürich (Switzerland)
13.
Indicators QS ranking
Fields:
• Arts & Humanities
• Engineering & Technology
• Life Sciences & Medicine
• Natural Sciences
• Social Sciences & Management
Ranking by field:
• Based on same
indicators
• Weightings are
different
14.
Position of 3TU, LDE & IDEA League in QS ranking

University                     | General ranking (2011) | Engineering and Technology ranking (2011) | Natural Sciences ranking (2011)
TU Delft                       | 103 (104) | 18 (18)   | 91 (79)
TU Eindhoven                   | 158 (146) | 67 (61)   | 186 (177)
Universiteit Twente            | 224 (226) | 101 (116) | 267 (229)
Universiteit Leiden            | 75 (88)   | -         | 89 (80)
Erasmus Universiteit Rotterdam | 99 (103)  | -         | -
Imperial College London        | 6 (6)     | 6 (6)     | 11 (11)
ETH Zürich                     | 13 (18)   | 8 (8)     | 10 (10)
Ecole Polytechnique*           | 41 (36)   | 36 (36)   | 43 (40)
Aachen RWTH                    | 150 (140) | 30 (35)   | 83 (82)

* ParisTech consists of eleven 'Grandes Ecoles Paris', of which Ecole Polytechnique is the best known.
16.
Indicators Shanghai ranking (since '03)

Focus                | Indicator                                     | Code   | Weighting
Quality of education | Alumni winning Nobel Prizes and Fields Medals | Alumni | 10%
Quality of faculty   | Staff winning Nobel Prizes and Fields Medals  | Award  | 20%
Quality of faculty   | Highly cited researchers                      | HiCi   | 20%
Research output      | Articles and papers in Nature and Science     | N&S    | 20%
Research output      | Articles and papers in SCI and SSCI           | PUB    | 20%

ARWU's sixth indicator, per capita academic performance (PCP), carries the remaining 10%.
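The composite ARWU score is a weighted sum of these indicator scores, where each indicator is first scaled so that the best institution scores 100. A minimal sketch with made-up indicator scores:

```python
# Sketch: ARWU-style weighted composite (indicator scores are made up;
# the weights are ARWU's published ones, incl. PCP at 10%).
weights = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20,
           "N&S": 0.20, "PUB": 0.20, "PCP": 0.10}
scores = {"Alumni": 30.0, "Award": 0.0, "HiCi": 45.0,
          "N&S": 50.0, "PUB": 70.0, "PCP": 40.0}

# Weighted sum over all six indicators.
composite = sum(w * scores[k] for k, w in weights.items())
print(round(composite, 1))  # 40.0
```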
17.
Position of 3TU, LDE & IDEA League in Shanghai ranking

University                     | Academic Ranking of World Universities 2012 (2011)
TU Delft                       | 201-300 (151-200)
Universiteit Twente            | 301-400 (301-400)
TU Eindhoven                   | 301-400 (301-400)
Universiteit Leiden            | 73 (65)
Erasmus Universiteit Rotterdam | 151-200 (151-200)
ETH Zürich                     | 23 (23)
Imperial College London        | 24 (24)
Ecole Polytechnique*           | 301-400 (301-400)
RWTH Aachen University         | 201-300 (201-300)

* ParisTech consists of eleven 'Grandes Ecoles Paris', of which Ecole Polytechnique is the best known.
18.
Shanghai ranking: calculated trend

Calculated positions of TU Delft on the Shanghai ranking:

Year     | 2003 | 2004 | 2005 | 2006 | 2007 | 2008 | 2009 | 2010 | 2011 | 2012
Position | 242  | 224  | 234  | 191  | 194  | 197  | 193  | 185  | 197  | 214

These calculated scores are based on calculations per indicator by the University of Groningen.
20.
Position of TU Delft on impact indicators, Leiden Ranking 2010-2013 (worldwide top 500)

Indicator                                   | 2010 | 2011 | 2011* | 2013 | 2013*
Old 'Crown Indicator' (CPP/FCSm)            | 123  |      |       |      |
'Alternative Crown Indicator' (MNCS)        | 176  | 189  | 99    | 219  | 168
Proportion top 10% publications (PPtop 10%) |      | 179  | 115   | 203  | 164

* Size-independent impact indicators using fractional counting, excluding publications in special types of journals.
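Fractional counting, mentioned in the note above, credits a co-publication to an institution in proportion to its share of the paper, instead of counting every co-authored paper in full. A minimal sketch with illustrative numbers:

```python
# Sketch: full vs fractional counting of publications (illustrative data).
# Each tuple: (paper id, number of institutions sharing the paper).
papers = [("p1", 1), ("p2", 2), ("p3", 4)]

full_count = len(papers)                    # each paper counts once in full
fractional = sum(1 / n for _, n in papers)  # p1: 1, p2: 0.5, p3: 0.25
print(full_count, fractional)               # 3 1.75
```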
21.
Position Dutch universities: Industry Research Connections 2013

World  NL  University  PP(UI collab)
1 1 Eindhoven University of Technology 15,6%
2 2 Delft University of Technology 14,0%
18 3 Wageningen University and Research Centre 10,1%
24 4 University of Twente 9,8%
137 5 Erasmus University Rotterdam 7,4%
147 6 Leiden University 7,3%
153 7 Maastricht University 7,2%
160 8 Utrecht University 7,1%
186 9 University of Groningen 6,9%
214 10 VU University Amsterdam 6,5%
253 11 Radboud University Nijmegen 6,1%
257 12 University of Amsterdam 6,1%
Or #3 with different counting
22.
New developments
Improvement of existing rankings
• More attention to education, finance and field (QS, THE)
• More representative survey on reputation (QS, THE)
• Rankings per field and subject (Shanghai, QS)
New rankings and classifications (education and third mission,
fields and subjects, ranking per indicator, no numbering)
• CHE university ranking: BSc-students
• CHE excellence ranking: MSc/PhD-students
• U-Map (CHEPS): types/profiles
• U-Multirank (CHERPA/CHE): institutional and field* rankings
* e.g. engineering
24.
What is U-Multirank?
Multi-dimensional
• Performance comparison not only on research but also on
education, exploitation, international orientation and regional involvement.
Multi-level
• Performance profiles based on a broad set of indicators. The performance profiles
are available at two levels: institution as a whole and underlying disciplinary fields.
Multi-stakeholder
• Designed in close consultation with stakeholders and intended to meet the needs of students, administrators, policy makers, employers, etc.
Multi-ranking
• Users can decide which areas of performance to include in the comparison of the
selected group of universities; in this way U-Multirank produces personalised
rankings.
26.
QS Best student cities
Methodology
• “Two pre-requisites have been established to identify the
cities evaluated in this exercise. The first is that each city
must have a population of over 250,000, the second that it
must be home to at least two ranked institutions. Current
calculations suggest that 98 cities in the world qualify on this
basis.”
1: Paris, 2: London, 3: Boston, 4: Melbourne, 5: Vienna
and 36: Amsterdam
27.
Indicators of QS Best student cities
• Student mix
• Student Population: number of students as a proportion of the city's population
• International Volume: number of international students in the city
• International Ratio: number of international students as a proportion of all students
• Quality of living
• Mercer Quality of Living Survey 2011
• Employer activity
• Domestic Employer Popularity: Number of domestic employers who
identified one institution in the city as producing excellent graduates
• International Employer Popularity [x2]
• Affordability
• Tuition Fees [x2], Big Mac Index & Mercer Cost of Living Index
28.
Mercer's Quality of Living ranking
The Quality-of-living index
encompasses 39 different factors
within the following 10 categories:
• Political and social environment
• Economic environment
• Socio-cultural environment
• Medical and health considerations
• Schools and education
• Public services and transport
• Recreation
• Consumer goods
• Housing
• Natural environment
29.
How do we use international rankings?
Until now
• Participation in
rankings
• Internal memos
for the Executive
Board
• Annual report
• Roadmap 2020
• Website 'facts and figures'
• Marketing and PR
30.
Messages
• More and more international rankings (need)
• Are used by several stakeholders and affect your reputation
• Are biased and have methodological drawbacks
• However, methodologies are improving
• Nevertheless, important to be in the rankings
• It is difficult for specialized universities to reach a high position
in general rankings (TU Delft: technology/engineering)
• However, field normalization is improving
• New initiatives to improve international rankings:
• U-Multirank
31.
Questions and discussion:
Which rankings are relevant for the municipality of Delft?
Which rankings do you choose for marketing purposes?
• More information:
• www.3tu.nl/uploads/media/Rankings_en_3TU.pdf
• Thanks to Johan Verweij & Jan Salden
• I elaborated on Johan's presentations & used Jan's U-Multirank sheets
Kim Huijpen, Policy Advisor, TU Delft / Corporate Policy Affairs
T +31 (0)15 27 85296 | E K.Huijpen@tudelft.nl | @KimHuijpen
UIRC Scoreboard: TU Delft even has a second position with the default setting “Exclude publications in special types of journals” in the Leiden ranking. The UIRC 2013 scoreboard covers university-industry research connections and cooperation (co-publications with industry).
4 QS Subject Rankings in the top 100: QS Computer Science & Info Systems, QS Chemistry, QS Earth & Marine Sciences and QS Physics & Astronomy (51-100).
2 QS Subject Rankings in the top 200: QS Mathematics (101-150) & QS Education (151-200).
Indicators:
• Industry income: innovation (worth 2.5 per cent)
• Research income from industry (per academic staff)
• Research: volume, income and reputation (worth 30 per cent)
• Reputational survey – research
• Research income (scaled)
• Papers per academic and research staff
• Citations: research influence (worth 30 per cent)
• Citation impact (normalised average citations per paper)
• International outlook: staff, students and research (worth 7.5 per cent)
• Ratio of international to domestic staff
• Ratio of international to domestic students
• Proportion of internationally co-authored research papers
• Teaching: the learning environment (worth 30 per cent)
• Reputational survey – teaching
• PhD awards per academic
• Undergraduates admitted per academic
• Income per academic
• PhD awards / bachelor awards
This ensures that institutional comparisons are “like with like” and not “apples and oranges”. Why:
• Students: informed study choice
• Knowledge institutions: positioning, visibility, strategic comparisons
• Policy makers: insight into diversity and performance
• Companies: partners for cooperation