Excellence in Teaching – the road towards AHELO
Prof. Dr. Dirk Van Damme
Head of the Centre for Educational Research and Innovation – OECD/EDU
Outline
1. How to assess the teaching function: approaches and proxies
2. Analysis of how teaching is assessed and how it impacts on ranking in the THEWUR
3. AHELO: an attempt to measure learning outcomes
4. Some ‘political’ conclusions
1. How to assess teaching?
How to assess the teaching function of universities?
Students and graduates
- Participation rates in the age cohort
- Participation of specific groups
- Success and failure rates within programmes
- Graduation rates
  - in the age cohort
  - within a certain time perspective
Growth in university-level qualifications
Approximated by the percentage of the population that has attained tertiary-type A education in the age groups 25-34, 35-44, 45-54 and 55-64 years (2007)
How to assess the teaching function of universities?
Labour market success of graduates
- Effective transition to the labour market
  - unemployment after a certain time
  - employment and career prospects after 5 years
- Economic return on investment
  - higher return = higher reward of degrees on the labour market
  - private and public net value of a degree
How successful are students in moving from education to work?
Proportion of 25-29 year-olds with a tertiary degree working in low-skill occupations
Fewer than ¼ of tertiary graduates do not find a job that matches their educational level
EAG 2010, C3
How to assess the teaching function of universities?
Academic quality, staff/student ratio
- Despite some attempts, reports of quality assurance agencies provide little basis for measurement
- The Times Higher Education WUR uses the staff/student ratio as a proxy for quality: the higher the ratio, the better the teaching/learning environment
How to assess the teaching function of universities?
Academic quality, staff/student ratio
Other proxies used in THEWUR 2010:
- ratio of PhD students to undergraduate students (more PhDs = more research-intensive teaching)
- ratio of PhD students or PhDs awarded to academic staff
- quality of educational infrastructure, measured by the ratio of income to staff
How to assess the teaching function of universities?
Academic reputation
- Reputation is a very important driver in the dynamics of higher education systems: the ‘reputation race’ (Frans van Vught)
- Measured as a component of rankings
- E.g. the Times Higher Education WUR Academic Reputation Survey, a poll of some 14,000 scholars for the 2010 THEWUR ranking, responsible for 50% of the teaching score
How to assess the teaching function of universities?
Academic reputation
- Reputation is a very questionable but extremely influential and powerful indicator
- Outcomes are very skewed: ‘the winner takes all’, as in sports or pop music
- Time-lag
- Perceived popularity is different from quality
- Distorted by reputation in research, and often limited to certain disciplines
How to assess the teaching function of universities?
Consumer satisfaction approaches
- The CHE approach, for example, relies heavily on student questionnaires covering various aspects of students' educational experience, but also other parts of their experience
2. Analysis of teaching in the THEWUR
Analysis of the THEWUR 2010
The outcomes of the Times Higher Education World University Rankings provide one of the best available sources for analysing the teaching function via the proxies used:
- 50% reputation survey
- 15% undergraduate students to staff ratio
- 7.5% PhD degrees to undergraduate degrees ratio
- 20% PhD degrees to staff ratio
- 7.5% income to staff ratio
This 100% teaching score counts for 30% of the overall ranking
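The weighting scheme above can be sketched as a simple weighted sum. This is an illustrative sketch only: the indicator names and example scores are hypothetical; only the five weights and teaching's 30% share of the overall score come from the slide.

```python
# Weights of the five teaching proxies in the THEWUR 2010 teaching pillar
# (indicator key names are hypothetical labels for illustration).
TEACHING_WEIGHTS = {
    "reputation_survey": 0.50,
    "staff_student_ratio": 0.15,
    "phd_to_undergrad_degrees": 0.075,
    "phd_degrees_to_staff": 0.20,
    "income_to_staff": 0.075,
}

def teaching_pillar(scores):
    """Weighted sum of normalised (0-100) indicator scores."""
    return sum(w * scores[k] for k, w in TEACHING_WEIGHTS.items())

# Hypothetical university with indicator scores on a 0-100 scale
example_scores = {
    "reputation_survey": 80.0,
    "staff_student_ratio": 60.0,
    "phd_to_undergrad_degrees": 70.0,
    "phd_degrees_to_staff": 65.0,
    "income_to_staff": 55.0,
}

pillar = teaching_pillar(example_scores)  # -> 71.375
overall_share = 0.30 * pillar             # teaching's contribution to the overall score
```

Note how the reputation survey alone drives half the pillar, which is why the slides treat reputation as the dominant (and most questionable) teaching proxy.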
The structure of the top 200, according to the THEWUR 2010 Overall Score: peaks and plateaux
The structure of the top 200: regions (N=81, N=82, N=27)
The structure of the top 200, according to the THEWUR 2010 Overall Score?
The structure of the top 200: Teaching – Research – Citations (std = 16.99, 16.29, 14.63)
Analysis of the THEWUR 2010
- The teaching score has the flattest profile and the lowest variation in scores
- In fact, in teaching (as measured in the THEWUR) most universities are not so different
- The research function shows more variation
- The citation score has the highest variation
- Are these artefacts of the measurement methodology, or of reality?
Function coherence in Top 200
Analysis of the THEWUR 2010
- High correlation between research and teaching (.86): as measured in the THEWUR, good research universities are in general also good teaching universities, though with important exceptions
- But low correlation between research and citations and between teaching and citations: both .28
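The .86 and .28 figures above are ordinary Pearson correlations between pillar scores. A minimal sketch of that computation, on entirely hypothetical pillar scores for five universities:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pillar scores (0-100) for five universities
research = [95.0, 80.0, 70.0, 60.0, 50.0]
teaching = [90.0, 85.0, 65.0, 62.0, 48.0]

r = pearson(research, teaching)  # close to 1: the two pillars move together
```

A value near 1, as in this toy example, mirrors the slide's point that research and teaching scores largely track each other in the ranking.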
Analysis of the THEWUR 2010
Function coherence (measured by the absolute difference between teaching and research scores):
- is rather high over the whole ranking list
- is higher in North America than in Europe or Asia
- suggesting that in the upper part of the global HE system, excellence in teaching goes hand in hand with excellence in research
- binding the two functions together is still at the heart of the academic mission and identity
- but this is probably also a consequence of the choice of indicators used
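The coherence measure just described reduces to a per-university absolute gap, averaged over a group. A sketch with hypothetical scores (a smaller mean gap = higher coherence, matching the North America vs. Europe contrast on the slide):

```python
from statistics import mean

# Hypothetical teaching and research pillar scores for four universities
universities = [
    {"region": "North America", "teaching": 90.0, "research": 92.0},
    {"region": "North America", "teaching": 75.0, "research": 78.0},
    {"region": "Europe",        "teaching": 70.0, "research": 80.0},
    {"region": "Europe",        "teaching": 65.0, "research": 55.0},
]

def gap(u):
    # Absolute teaching-research difference; smaller gap = more coherent
    return abs(u["teaching"] - u["research"])

by_region = {}
for u in universities:
    by_region.setdefault(u["region"], []).append(gap(u))

mean_gap = {region: mean(gaps) for region, gaps in by_region.items()}
# e.g. {"North America": 2.5, "Europe": 10.0} -> higher coherence in North America
```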
Teaching, research, citations
Analysis of the THEWUR 2010
But closer analysis reveals some interesting findings:
- Ranked on the teaching dimension, the capacity to translate research into citation output increases as you move down the ranking
- Meaning that less teaching-oriented universities have a slightly higher efficiency in research
- But with enormous variation
Function coherence in Top 200
Preliminary conclusions
- We definitely need much better indicators to understand and measure the teaching function of universities
- Resisting the development of sound measurement of teaching implicitly confirms the dominance of research in rankings and reputation
- Indicators need to go to the heart of the teaching-learning interaction and be output-oriented, not input- or process-oriented
3. AHELO: an attempt to assess learning outcomes
The OECD AHELO feasibility study
What is AHELO?
- A ground-breaking initiative to assess HE learning outcomes on an international scale, by creating measures that would be valid:
  - for all cultures and languages
  - and also for the diversity of HE institutions
Why undertake the study?
- After decades of quantitative growth in HE, consensus on the need to ensure quality for all (Athens, 2006)… but an information gap on learning outcomes
- Carry out a feasibility study to provide a proof of concept (Tokyo, 2008)
Why is AHELO important?
- It employs a wide range of measures
- It provides faculties, students and government agencies with a more balanced assessment of HE quality – not just research-driven rankings!
- No sacrifice of HEIs’ missions or autonomy in their subsequent efforts to improve performance
The feasibility study at a glance
- Goal? To evaluate whether reliable cross-national assessments of HE learning outcomes are scientifically possible and whether their implementation is feasible.
- What? Not a pilot, but rather a research approach to provide a proof of concept and proof of practicality.
- Why? The outcomes will be used to assist countries in deciding on the next steps.
- When? Phase 1 – development of tools: August 2010 to April 2011; Phase 2 – implementation: August 2011 to December 2012.
- Who? Data will be collected from a targeted population of students who are near, but before, the end of their first 3-4 year degree.
- How? The OECD’s role is to establish broad frameworks that guide international expert committees charged with instrument development in the assessment areas.
A multi-dimensional definition of quality
Addressing the needs of various users and uses:
- “bottom line” of performance
- “value-added” to assess the quality of services
- contextual data to reveal best practices and problems, and to identify teaching and learning practices leading to greater outcomes
Both in discipline-related competencies…
- easily interpretable in the context of departments and faculties…
- but requiring highly differentiated instruments
…and in generic skills:
- less dependent on occupational and cultural contexts, applicable across HEIs…
- but reflecting cumulative learning outcomes and less relevant to the subject-matter competencies that are familiar to HEIs, departments or faculties
Remarks on data collection
- Institutions/departments are the units of analysis, hence measures and reporting at the HEI/department level
- No comparative data at the national level
- Feedback to HEIs: performance profiles and contextual data, with their own results and those of other HEIs (anonymously)
AHELO: 4 strands of work
- Generic skills strand: international pilot test of the US Collegiate Learning Assessment (CLA), to assess the extent to which problem-solving or critical thinking can be validly measured across different cultural, linguistic and institutional contexts (+ contextual data)
- Discipline strand in Engineering: initial work on defining expected learning outcomes through the ‘Tuning’ approach (+ contextual data)
- Discipline strand in Economics: initial work on defining expected learning outcomes through the ‘Tuning’ approach (+ contextual data)
- Research-based “value-added” or “learning gain” measurement strand: several perspectives to explore the issue of value-added (conceptually and psychometrically), building on recent OECD work at school level
AHELO tests of instruments
3 assessment instruments:
- Generic skills
- Discipline-specific skills: Engineering and Economics
2 contextual surveys (contextual indicators and indirect proxies of quality):
- Student survey
- Faculty survey
Work to be undertaken in 2 phases
- Phase 1 – initial proof of concept: development of the frameworks (Generic Skills, Economics, Engineering), instrument development and small-scale validation, plus the contextual dimension surveys
- Phase 2 – scientific feasibility and proof of practicality: implementation, i.e. project management, survey operations and analyses of results
The Generic skills strand
The CLA Performance Task requires students to use an integrated set of skills:
- critical thinking
- analytic reasoning
