Excellence in teaching


Presentation at the ACA Seminar on 'World-Class. The brave new world of global higher education and research', Brussels, 15 October


  1. 1. Excellence in Teaching – the road towards AHELO<br />Prof. Dr. Dirk Van Damme<br />Head of the Centre for Educational Research and Innovation – OECD/EDU<br />
  2. 2. Outline<br />How to assess the teaching function: approaches and proxies<br />Analysis of how teaching is assessed and how it impacts on ranking in the THEWUR<br />AHELO: an attempt to measure learning outcomes<br />Some ‘political’ conclusions<br />2<br />
  3. 3. How to assess teaching?<br />1.<br />3<br />
  4. 4. How to assess the teaching function of universities?<br />Students and graduates<br />Participation rates in age cohort<br />Participation of specific groups<br />Success and failure rates within programmes<br />Graduation rates<br />In age cohort<br />Within a certain time frame<br />4<br />
  5. 5. Growth in university-level qualifications<br />Approximated by the percentage of the population that has attained tertiary-type A education in the age groups 25-34 years, 35-44 years, 45-54 years and 55-64 years (2007)<br />
  6. 6. How to assess the teaching function of universities?<br />Labour market success of graduates<br />Effective transition to the labour market<br />Unemployment after a certain time<br />Employment and career prospects after 5 years<br />Economic return on investment<br />Higher return = higher reward for degrees on the labour market<br />Private and public net value of degree<br />6<br />
  7. 7. How successful are students in moving from education to work?<br />Proportion of 25-29 year-olds with a tertiary degree working in low-skill occupations<br />Fewer than ¼ of tertiary graduates fail to find a job that matches their educational level<br />EAG 2010 C3.7<br />
  8. 8. How to assess the teaching function of universities?<br />Academic quality, staff/student ratio<br />Despite some attempts, reports of quality assurance agencies provide little basis for measurement<br />The Times Higher Education WUR uses staff/student ratio as a proxy for quality<br />the higher the ratio, the better the teaching/learning environment<br />8<br />
  9. 9. How to assess the teaching function of universities?<br />Academic quality, staff/student ratio<br />Other proxies used in THEWUR2010<br />Ratio of PhD students to undergraduate students<br />More PhDs = more research-intensive teaching<br />Ratio of PhD students or PhDs to academic staff<br />Quality of educational infrastructure, measured by the ratio of income to staff<br />9<br />
  10. 10. How to assess the teaching function of universities?<br />Academic reputation<br />Reputation is a very important driver in the dynamics of higher education systems: the ‘reputation race’ (Frans van Vught)<br />Measured as a component of rankings<br />E.g. the Times Higher Education WUR: Academic Reputation Survey, a poll of some 14,000 scholars for the 2010 THEWUR ranking, accounting for 50% of the Teaching score<br />10<br />
  11. 11. How to assess the teaching function of universities?<br />Academic reputation<br />Reputation is a very questionable but extremely influential and powerful indicator<br />Outcomes are very skewed: ‘the winner takes all’, like in sports or pop music<br />Time-lag<br />Popularity perception is different from quality<br />Distorted by reputation in research, and often limited to certain disciplines<br />11<br />
  12. 12. How to assess the teaching function of universities?<br />Consumer satisfaction approaches<br />The CHE approach, for example, relies heavily on student questionnaires covering various aspects of students’ educational experience, but also other parts of their experience<br />12<br />
  13. 13. Analysis of teaching in the THEWUR<br />2.<br />13<br />
  14. 14. Analysis of the THEWUR2010<br />The outcomes of the Times Higher Education World University Rankings provide one of the best available sources for analysing the teaching function via the proxies used<br />50% reputation survey<br />15% undergraduate students to staff ratio<br />7.5% PhD degrees to undergraduate degrees ratio<br />20% PhD degrees to staff ratio<br />7.5% income to staff ratio<br />The teaching pillar as a whole counts for 30% of the overall ranking<br />14<br />
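The weighting scheme above is simple arithmetic: each of the five proxies contributes a fixed share of the teaching pillar, and the pillar contributes 30% of the overall score. A minimal sketch, assuming all indicator values have already been normalised to a 0-100 scale (the function names and dictionary keys are illustrative, not THE's own):

```python
# Illustrative only: combine the five THEWUR 2010 teaching proxies
# into the teaching pillar score, then apply the pillar's 30% weight
# in the overall ranking. Indicator values are assumed normalised 0-100.
TEACHING_WEIGHTS = {
    "reputation_survey": 0.50,
    "staff_student_ratio": 0.15,
    "phd_ugrad_ratio": 0.075,
    "phd_staff_ratio": 0.20,
    "income_staff_ratio": 0.075,
}

def teaching_pillar(scores: dict) -> float:
    """Weighted sum of the five normalised teaching proxies (0-100)."""
    return sum(TEACHING_WEIGHTS[k] * scores[k] for k in TEACHING_WEIGHTS)

def teaching_contribution(scores: dict) -> float:
    """Contribution of the teaching pillar to the overall score (30%)."""
    return 0.30 * teaching_pillar(scores)
```

A university scoring 100 on every proxy would get a pillar score of 100 and contribute 30 points to its overall score; the five weights sum to 1.0.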
  15. 15. The structure of the top 200according to the THEWUR 2010 Overall Score<br />Peaks and Plateaux<br />
  16. 16. The structure of the top 200Regions<br />N=81<br />N=82<br />N=27<br />
  17. 17. The structure of the top 200according to the THEWUR 2010 Overall Score<br />?<br />
  18. 18. The structure of the top 200Teaching – Research – Citations <br />Std=16.99<br />Std=16.29<br />Std=14.63<br />
  19. 19. Analysis of the THEWUR2010<br />The teaching score has the flattest profile and the lowest variation in scores<br />In fact, in teaching (as measured in the THEWUR) most universities are not so different<br />The research function shows more variation<br />The citation score shows the highest variation<br />Are these artefacts of the measurement methodology or of reality?<br />
  20. 20. Function coherence in Top 200<br />
  21. 21. Analysis of the THEWUR2010<br />High correlation of .86 between research and teaching<br />As measured in the THEWUR, good research universities are in general also good teaching universities, but with important exceptions<br />But low correlations between research and citations and between teaching and citations: both .28<br />
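The correlations quoted above are ordinary Pearson coefficients computed across the pillar scores of the ranked universities. A self-contained sketch of that computation (the function and data are illustrative, not the actual THEWUR dataset):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Applied to the teaching and research columns of the top 200, this is the .86 figure; applied to either pillar and the citations column, the .28 figures.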
  22. 22. Analysis of the THEWUR2010<br />Function coherence (as measured by the absolute difference between teaching and research scores)<br />Is rather high over the whole ranking list<br />Is higher in North America than in Europe or Asia<br />Suggesting that in the upper part of the global HE system excellence in teaching goes hand in hand with excellence in research<br />Binding the two functions is still at the heart of the academic mission and identity<br />But that is probably also a consequence of the choice of indicators used<br />
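The coherence measure defined above reduces to a simple per-institution gap, where a smaller gap means higher coherence. A minimal sketch of that definition and its aggregation over a ranking (names are illustrative):

```python
def function_coherence(teaching: float, research: float) -> float:
    """Absolute gap between teaching and research pillar scores.
    A smaller gap means higher function coherence."""
    return abs(teaching - research)

def mean_coherence_gap(pairs) -> float:
    """Average teaching-research gap over (teaching, research) score pairs,
    e.g. for a region's universities in the top 200."""
    return sum(abs(t - r) for t, r in pairs) / len(pairs)
```

Comparing `mean_coherence_gap` across regional subsets is how a statement like "coherence is higher in North America than in Europe or Asia" would be checked.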
  23. 23. Teaching, research, citations<br />
  24. 24. Analysis of the THEWUR2010<br />But closer analysis reveals some interesting findings<br />Ranked on the teaching dimension, the capacity to translate research into citation output increases as you move down the ranking<br />Meaning that less teaching-oriented universities have a slightly higher efficiency in research<br />But with enormous variation<br />
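The slide does not define the "capacity to translate research into citation output" precisely; the simplest reading is a ratio of the citations score to the research score. A hypothetical sketch under that assumption:

```python
def citation_efficiency(citations_score: float, research_score: float) -> float:
    """Rough gauge of how effectively a university's research score
    translates into citation impact (illustrative definition: the ratio
    of the two THEWUR pillar scores)."""
    if research_score == 0:
        return float("nan")  # undefined when there is no research score
    return citations_score / research_score
```

Computing this ratio for universities sorted by teaching score is one way to observe the downward-rank increase described above.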
  25. 25. Function coherence in Top 200<br />
  26. 26. Preliminary conclusions<br />We definitely need much better indicators to understand and measure the teaching function of universities<br />Resisting the development of sound measurement of teaching implicitly confirms the dominance of research in rankings and reputation<br />Indicators need to go to the heart of the teaching-learning interaction and be output-oriented, not input- or process-oriented<br />26<br />
  27. 27. AHELO: an attempt to assess learning outcomes<br />3.<br />27<br />
  28. 28. The OECD AHELO feasibility study<br />What is AHELO?<br />A ground-breaking initiative to assess HE learning outcomes on an international scale, by creating measures that would be valid:<br /><ul><li> For all cultures and languages
  29. 29. And also for the diversity of HE institutions</li></ul>Why undertake the study?<br />After decades of quantitative growth in HE, consensus on the need to ensure quality for all (Athens, 2006)… but information gap on learning outcomes<br />Carry out a feasibility study to provide a proof of concept (Tokyo, 2008)<br />Why is AHELO important?<br /><ul><li>Employs a wide range of measures
  30. 30. Provides faculties, students and government agencies with a more balanced assessment of HE quality – not just research-driven rankings!
  31. 31. No sacrifice of HEIs’ missions or autonomy in their subsequent efforts to improve performance</li></ul>28<br />
  32. 32. The feasibility study at a glance<br />Goal? To evaluate whether reliable cross-national assessments of HE learning outcomes are scientifically possible and whether their implementation is feasible.<br />What? Not a pilot, but rather a research approach to provide a proof of concept and proof of practicality.<br />Why? The outcomes will be used to assist countries to decide on the next steps.<br />When? Phase 1 - Development of tools: August 2010 to April 2011; Phase 2 - Implementation: August 2011 to December 2012.<br />Who? Data will be collected from a targeted population of students who are near, but before, the end of their first 3-4 year degree.<br />How? OECD’s role is to establish broad frameworks that guide international expert committees charged with instrument development in the assessment areas.<br />29<br />
  33. 33. Multi-dimensional definition of quality<br />Addressing the needs of various users and uses<br /><ul><li>“Bottom line” of performance
  34. 34. “Value-added” to assess the quality of services
  35. 35. Contextual data to reveal best practices and problems, and to identify teaching and learning practices leading to greater outcomes</li></ul>Both in discipline-related competencies …<br /><ul><li>Easily interpretable in the context of departments and faculties ...
  36. 36. But require highly differentiated instruments</li></ul>And in generic skills<br /><ul><li>Less dependent on occupational and cultural contexts, applicable across HEIs …
  37. 37. But reflect cumulative learning outcomes and are less relevant to the subject-matter competencies that are familiar to HEIs, departments or faculties</li></ul>30<br />
  38. 38. Remarks on data collection<br /><ul><li>Institutions/departments are the units of analysis, hence measures and reporting at HEI/dept level
  39. 39. No comparative data at the national level
  40. 40. Feedback to HEIs: performance profiles and contextual data, with their own results and those of other HEIs (anonymously)</li></ul>31<br />
  41. 41. AHELO: 4 strands of work<br />Generic skills strand: International pilot test of the US Collegiate Learning Assessment (CLA), to assess the extent to which problem-solving or critical thinking can be validly measured across different cultural, linguistic and institutional contexts. + contextual data<br />Discipline strand in Engineering: Initial work on defining expected learning outcomes through the ‘Tuning’ approach. + contextual data<br />Discipline strand in Economics: Initial work on defining expected learning outcomes through the ‘Tuning’ approach. + contextual data<br />Research-based “Value-added” or “Learning gain” measurement strand: Several perspectives to explore the issue of value-added (conceptually and psychometrically), building on recent OECD work at school level.<br />32<br />
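The value-added strand above explores several conceptual and psychometric options; the slides do not fix one. A minimal sketch of the two simplest formulations only (raw learning gain, and value-added as the gap between observed and expected exit scores) with illustrative names:

```python
def learning_gain(entry_score: float, exit_score: float) -> float:
    """Raw learning gain: difference between exit and entry assessment scores."""
    return exit_score - entry_score

def value_added(exit_score: float, expected_exit: float) -> float:
    """Value-added: how much the observed exit score exceeds the score
    expected from the intake profile (e.g. predicted from entry scores),
    crediting institutions for improvement rather than selectivity."""
    return exit_score - expected_exit
```

The residual-based second form is the usual reason value-added is preferred over raw gain: it separates what the institution adds from what the intake already brought.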
  42. 42. AHELO tests of instruments<br />3 assessment instruments<br />Generic Skills<br />Discipline-specific skills:<br />Engineering<br />Economics<br />2 contextual surveys<br />Contextual indicators and indirect proxies of quality:<br />Student survey<br />Faculty survey<br />33<br />
  43. 43. Work to be undertaken in 2 phases<br />Phase 1 - Initial proof of concept: frameworks (Generic Skills, Economics, Engineering), followed by instrument development & small-scale validation (Generic Skills, Economics and Engineering instruments, plus contextual dimension surveys)<br />Phase 2 - Scientific feasibility & proof of practicality: implementation (project management, survey operations and analyses of results)<br />34<br />
  44. 44. The Generic skills strand<br />The CLA Performance Task<br /><ul><li>Requires students to use an integrated set of skills:
  45. 45. critical thinking
  46. 46. analytic reasoning
  47. 47. problem solving
  48. 48. written communication</li></ul>to answer several open-ended questions about a hypothetical but realistic situation<br /><ul><li>Requires students to marshal evidence from different sources such as letters, memos, summaries of research reports, maps, diagrams, tables, …</li></ul>35<br />
  49. 49. Participating countries – Generic Skills<br />36<br />
  50. 50. The economics strand<br />Tuning-AHELO framework of learning outcomes<br />37<br />
  51. 51. Participating countries - Economics<br />38<br />
  52. 52. The engineering strand<br />Tuning-AHELO framework of learning outcomes<br />39<br />
  53. 53. Participating Countries - Engineering<br />40<br />
  54. 54. The contextual dimension: 2 surveys<br />A brief student survey (15 minutes maximum)<br />Looking at:<br /><ul><li>Demographic profile of students such as age, gender, disadvantaged groups, or socio-economic status…
  55. 55. Practices in teaching and learning such as students’ perceptions of academic challenge, clear sense of direction, quality of effort, student-faculty relationship,…
  56. 56. …</li></ul>A brief faculty survey (15 minutes maximum)<br />Looking at:<br /><ul><li>Curricular design and pedagogy philosophies such as curriculum reforms integrating application and problem solving skills, expectations for teaching practices, …
  57. 57. Alternative instructional settings such as workplace placements or internships, simulations or problem-based learning…
  58. 58. …</li></ul>Contextual data to better interpret resulting outcomes<br />41<br />
  59. 59. Participating countries –All strands<br />42<br />
  60. 60. Practical considerations<br /><ul><li>Test of practicality of implementation: international standards for test administration and student participation rates within HEIs
  61. 61. Assessments will be computer-delivered or web-based (phase 2)
  62. 62. Performance described through proficiency levels and “can-do” statements
  63. 63. Feedback to HEIs: performance profiles and contextual data, with their own results and those of other HEIs (anonymously)</li></ul>43<br />
  64. 64. A study with great potential…<br />… Diagnosis is the basis of any improvement<br />Better information on student learning outcomes is the first step to <br />improve teaching and learning for all:<br /> Provide evidence for national and institutional policy and practice<br /> Equip institutions with the method and tools to improve teaching<br />… Shaping the future of higher education to address key challenges<br />Equity<br />Build fairer higher education systems, promoting success for all<br />Responsiveness<br />Better connect higher education and society<br />Effectiveness<br />Help students make informed choices to ensure success for all<br />Impact<br />Foster international transparency and mobility<br />44<br />
  65. 65. Some ‘political’ conclusions<br />We need to balance the measurement of universities’ qualities by acknowledging teaching<br />Indicators and measurements are never neutral, but become benchmarks and policy goals<br />We need to reward institutions for the added value they create in the teaching-learning process<br />No doubt, the added value in terms of knowledge and skills is the essence<br />Indicator and measurement systems in higher education have become inferior to those for school education, so progress is urgently needed<br />45<br />
  66. 66. Thank you !<br />dirk.vandamme@oecd.org<br />www.oecd.org/edu/ceri<br />www.oecd.org/edu/ahelo<br />46<br />