Making reform happen and evaluating reform in education

Presentation for the Norwegian Education Ministry - 22 June 2011

  • Partnerships; networks; knowledge brokering; institutional champions and thought leaders; ownership; uptake and implementation; resistance to change/fatigue. During the review visit, there were plenty of examples of such enabling structures having grown out of local social capital and a commonly perceived sense of urgency, or as a result of a particular funding arrangement for a defined period of time.
  • Complex governance structures without fluid communication channels: vertically (national, regional) and horizontally (across government departments).
  • Emphasis on rational thinking, but is this how decisions are actually made? How do we capture after-the-fact explanation/revelation? How much input do drivers/funders have on the process (as opposed to the evidence base and research)?
  • 1) Comprehensive evaluation of the system and how to enhance capacity; transparency about information gaps. 2) Efforts to develop a systemic approach to innovation in VET are rare. Only Switzerland and, to a lesser extent, Australia can be said to have designed a systemic approach to innovation in VET. 3) There is no critical mass of codified, formal and research-based knowledge on VET at either national or international level. Knowledge brokerage institutions supporting the genesis and diffusion of innovations are scarce, and knowledge-based linkages between stakeholders are weak. 4) Investing in VET innovations without carefully planning their evaluation should not be an option. Decisions about the sustainability or scaling up of innovations cannot be taken on informed, and eventually evidence-based, grounds unless mechanisms to assess their effects are in place. Policies aimed at fostering innovation in VET cannot be assessed in the absence of feedback.
  • Explain the irrationality of politics

    1. Making reform happen and evaluating reform in education: A ‘knowledge management’ approach
       Dirk Van Damme, Head of the Centre for Educational Research and Innovation, OECD
    2. Outline
       • Knowledge management in education
       • Knowledge as part of systemic innovation
       • Making reform happen in education
       • The ‘GPS’ approach
       • Governance and knowledge
       • Conclusions and questions
    3. A. Knowledge management in education
    4. Educational R&D
       • Striking finding: education, as a knowledge sector, has a very weak knowledge base of its own
       • Low levels of educational R&D (but difficulties in finding a methodology for comparable data collection)
       • Much lower than related public policy sectors such as health or social policy
       • A weak empirical research capacity, especially for quantitative research
       • A weak link between research and policy
    5. Educational R&D (figure)
    6. Educational R&D (figure)
    7. Educational R&D (figure)
    8. Evidence in Education
       • The emergence of evidence in education…
         - Methodologically sound solutions found for measurement issues in education
         - Comparative education indicators
         - Moving from inputs to outputs and outcomes
         - The PISA shock
         - The development of feedback systems, at student, school and system levels
       • …dramatically changed the policy climate
    9. Evidence in Education
       • Important methodological issues and debates on what counts as (scientific research) evidence
       • (Quasi-)experimental designs, randomised controlled trials, …
       • Lack of large-scale longitudinal studies
       • Scientific ideal versus pragmatically feasible
       • Cost and capacity problems
       • Ethical issues about educational experimenting; privacy issues related to data collection
    10. Evidence in Education
       • Research–policy interaction: not a simple model of direct impact of evidence on policy, but mediation by all stakeholders and actors in a complex system
       • “The influence of knowledge on policy making may in fact be strongest not when it comes directly from the educational research community in direct advice to policy makers but when it is filtered through actors such as print or broadcast media, lobbyists, popularisers, etc.” (EU, 2007, p. 5)
       • Mediation seems to be the weakest link in the knowledge chain
    11. Evidence in Education (figure)
    12. (figure)
    13. B. Knowledge as part of systemic innovation
    14. (figure)
    15. Systemic Innovation in Education
       • Objectives:
         - Investigate how systems go about change
         - Processes and stakeholder relationships
         - Knowledge management perspective
       • 14 case studies: Australia, Denmark, Germany, Hungary, Mexico, and Switzerland
       • Factors that facilitate/impede the use of evidence
       • Lessons learned about the use of evidence
    16. Enablers of innovation
       • Consensus building
       • Political vision
       • Research evidence
       • Brokerage: generation and dissemination of knowledge
       • Legitimating rigour/quality
       • Developing cooperation/trust
    17. Barriers to innovation
       • Change fatigue
       • Competing policy agendas
       • Accountability mechanisms and public policy agendas:
         - Restricted risk management
         - Short-term planning
    18. Model of Change in Education (diagram)
       • Identification of needs: What are the drivers of change? Which stakeholders are involved?
       • Knowledge base: What types of knowledge (tacit, explicit)? What knowledge sources?
       • Development of innovation
       • Implementation process: without piloting, large-scale implementation; with piloting, small-scale implementation, monitoring/evaluation, then scaling-up
       • Evaluation & monitoring (surveillance/judgement of outcomes): How and when? What criteria are used? Summative or formative purpose? What are the findings?
       • Output and outcomes
    24. Systemic Innovation in Education
       • Research evidence interacts with various other forms of knowledge in innovation processes in education:
         - Professional/practitioner knowledge
         - Teachers’ core pedagogical knowledge and beliefs as part of their professional identity
         - Tacit knowledge
       • Establishing a ‘culture of evaluation’ is critically important for the success of reform and the planning of next reforms
    25. Conclusions: Policy Implications
       • Systemic innovation as a useful analytical framework
         - Targeted strategy to induce system-wide change
         - Systemic innovation as guiding principle for innovation policy
       • Establish a formalised knowledge base
         - Monitoring and evaluation
         - Support the link between systems research and innovation
         - Evidence-informed dialogue with stakeholders
       • Without a formalised knowledge base: losing innovation opportunities; not cost-effective

    32. C. Making reform happen in education
    33. (figure)
    34. Key lessons on education reform
       • Making reform happen needs to:
         - Actively engage stakeholders in formulating and implementing policy responses
         - Make effective use of evidence to shape policies
         - Explain clearly the underlying principles and aims of reforms
       (from: OECD Education Policy Committee meeting at CEO level, Seoul, Korea, 2008)
    35. Key lessons on education reform
       More specific lessons from reform experience:
       • Policy makers need to build consensus on the aims of education reform and actively engage stakeholders, especially teachers, in formulating and implementing policy responses
       • Reforms can capitalise on external pressures as part of building a compelling case for change
       • All political players and stakeholders need to develop more realistic expectations about the pace and nature of reforms to improve outcomes
       • Reforms need to be backed by sustainable financing
       • There is some shift away from reform initiatives per se towards building self-adjusting systems with rich feedback at all levels, incentives to react and tools to strengthen capacities to deliver better outcomes
    36. Key lessons on education reform
       More specific lessons from reform experience:
       • Investment is needed in change-management skills in the education system
       • Teachers need reassurance that they will be given the tools to change, and recognition of their professional motivation to improve outcomes for their students
       • Evidence can be used more effectively to guide policy making, combining international benchmarks with national surveys and with inspectorates to provide a better diagnosis
       • Evidence is most helpful when it is fed back to institutions along with information and tools about how they can use it to improve outcomes
       • “Whole-of-government” approaches can include education in more comprehensive reforms; these need effective co-ordination and overall leadership across all the relevant ministries
    37. Making reform happen
       Case studies:
       • Denmark: introducing a culture of evaluation
       • Finland: introducing polytechnics into the tertiary education sector
       • Portugal: tertiary education reform and teacher education reform
       • Norway: improving lower secondary education
       • Mexico: improving schools
       (Education and Training Policy Division)
    38. Making reform happen (Mexico project video)
    39. Making reform happen
       Key elements relevant to making reform happen in Mexico:
       • Greater understanding among stakeholders of each other’s roles and perspectives
       • Greater commitment to work together on lifting performance
       • Stronger capacity for undertaking education policy reform
       • Strong Mexican ownership of recommendations
       • Obstacles to reform identified, and recommendations realistic
       • Long-term vision as well as immediate steps to take
       • OECD in a facilitating role, bringing international expertise to support Mexico’s reforms
    40. D. The ‘GPS’ Approach
    41. Leveraging the impact of knowledge
       • The intent is to mobilise and integrate evidence from data, analyses and policy advice generated over the years by the Education Directorate, and to make this knowledge accessible in a systematic way
       • This service to member countries is the analogue of a GPS navigation system: it provides real-time evidence to guide choices and trade-offs when countries need it for education policy decisions
       • Like a GPS, a search by any user at any time should return selected, focused and actionable information on the specific issue at hand for policy decision making
       (Indicators and Analysis Division)
    42. Leveraging the impact of knowledge
       The overall objectives of this service to countries are:
       • to provide real-time, comprehensive access to EDU’s rich evidence base on educational outcomes and policy experience, so that it can be used for policy decisions by governments
       • to offer a means by which countries can assess the contribution of educational reform to the growth and progress of their country, recognising that the full impact of educational reform is cumulative over the trajectory of policy decisions
    43. Objective 1: Real-time comprehensive access to evidence means a shift from whole documents and other sources to extracted evidence; the link to the source document is retained.
       Evidence-link from Education Today, 2010:
       • Teachers are positive about the appraisal and feedback they receive, but in some countries a significant minority or even a majority of teachers have not received any in recent years. Teachers across the different systems surveyed by TALIS tend to be positive about the appraisal and feedback they receive, reporting that on the whole it is fair, helpful for their work, and increases their job satisfaction. Approximately 13% of teachers surveyed by TALIS reported that they had received no feedback or evaluation in their current school in the previous five years; this average rises to much higher levels in Ireland (26%), Italy (55%), Portugal (26%) and Spain (46%).
       • High proportions of lower secondary teachers participate in professional development, but many say that they would like more. Nearly 9 in 10 teachers surveyed by TALIS reported having taken part in a structured professional development activity in the preceding 18 months, though in Denmark, the Slovak Republic and Turkey around a quarter reported no participation during that period. Despite generally high levels of participation, more than half the teachers (55%) in the TALIS countries overall say that they would have liked more professional development, and lack of suitable opportunities is a significant factor. Approximately a third of the surveyed teachers reported a high level of need for training to help them teach students with special learning needs. Other professional development priorities include teaching with ICT and dealing with difficult student behaviour.
       Evidence-link: Up to Upper Secondary School, Analysis. Education Today, 2010, p. 20
    44. Strand 1: Managing what we know
       • A digital tool to ensure OECD evidence is at your disposal, at your desk, when you need it, and in a form that is suitable and adapted to the specific policy context in which you need it
       • Three types of evidence will be available: data, analyses and policy experience
       • Work will continue to create the digital tool and to move evidence from the last three years into the database; it is estimated that the complete database will contain over 100,000 searchable individual pieces of evidence
       • The tool will be available in 2013; its utility depends on the database being large enough to be useful, and further additions will continue adding to its value

    45. Strand 1: Managing what we know
       • The Evidence Navigator for Education will be available on the OECD Education web site and on OECD iLibrary in early 2013
       • All evidence can be viewed at once, or data, analysis and policy evidence can be viewed separately
       • The database will have evidence extracted from all products generated by the Education Directorate (CERI, ETP, INES, IA, CELE, IMHE) and all the surveys (PISA, PIAAC, TALIS, etc.), all of which can be searched
       • The extracted evidence includes both published and non-published information
       • Search results show text, tables, figures, PowerPoint slides or video
       • If you want more information, the evidence is linked to the original source at the extracted location; for instance, clicking on a figure takes you to the page in the book or report from which it is drawn
       • Here is a short “trailer”
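The Evidence Navigator described in Strand 1 is, in essence, a store of tagged evidence items (data, analyses, policy experience), each linked back to its source document and filterable by type. As a minimal illustrative sketch only (the record fields, names and search logic here are assumptions, not the actual OECD implementation):

```python
from dataclasses import dataclass

@dataclass
class EvidenceItem:
    # Hypothetical record structure; field names are illustrative only.
    text: str       # the extracted piece of evidence
    kind: str       # "data", "analysis" or "policy"
    source: str     # link back to the original document/page
    keywords: set   # search tags

def search(items, query_terms, kind=None):
    """Return items whose keywords contain all query terms,
    optionally restricted to one evidence type."""
    terms = {t.lower() for t in query_terms}
    return [
        item for item in items
        if (kind is None or item.kind == kind)
        and terms <= {k.lower() for k in item.keywords}
    ]

# Two illustrative items paraphrased from the TALIS findings quoted on slide 43.
corpus = [
    EvidenceItem(
        text="13% of teachers reported no feedback or evaluation in five years",
        kind="analysis",
        source="Education Today, 2010, p20",
        keywords={"teachers", "appraisal", "feedback", "TALIS"},
    ),
    EvidenceItem(
        text="55% of teachers would have liked more professional development",
        kind="analysis",
        source="Education Today, 2010, p20",
        keywords={"teachers", "professional development", "TALIS"},
    ),
]

hits = search(corpus, ["teachers", "feedback"], kind="analysis")
```

The design choice mirrored here is the one the slides emphasise: each search result carries its `source` link so a user can jump from the extracted evidence back to the page it was drawn from.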
    56. Strand 2: Peer-learning from policy experiences
       • Peer-learning from evaluated policies in other countries that have achieved successful outcomes
       • Policy experience analysis will be incorporated into the GPS database
       • The first volume of Strong Performers, Successful Reformers was published in December 2010
       • The OECD–Japan Seminar, 28–29 June 2011, will include a second volume
    57. Strand 3: Building tools for successful policy implementation
       • Successful policy implementation is essential for reforms that achieve targets, within budget and on schedule; implementation tools and delivery capacity are useful in policy planning
       • When available, such tools for managing implementation and assessing delivery capacity will be incorporated into the GPS database for use by countries
       • A pilot of the tool developed by McKinsey and Michael Barber in the UK has been proposed to member countries in order to test its appropriateness and value
    58. E. Governance and knowledge
    59. Governance and Knowledge
       • Governing Complex Education Systems is a new CERI project dealing with the governance challenges of increasingly complex education systems:
         - Increasing number of actors and stakeholders
         - Multilevel governance issues
         - Decentralisation and re-centralisation
       • What is the role of knowledge in governance?
       • What is the role of governance in knowledge creation, dissemination and utilisation?
    60. Governance and Knowledge
       General research questions:
       • What models of governance are effective in complex education systems?
       • What knowledge systems are necessary to support this?
    61. Governance and Knowledge
       More specifically:
       • How to create the capacity at central levels to handle complex flows of knowledge
       • What knowledge options policy makers have in making decisions and involving stakeholders
       • How to provide the local levels in complex systems with sufficient knowledge to perform
       • How to ensure that levels share relevant knowledge
    62. Governance model (diagram): steering, accountability, priority setting, policy design, implementation, knowledge use, knowledge production
    63. Governance and Knowledge (figure)
    64. F. Conclusions and questions
    65. Conclusions and Questions
       • As in most policy areas, educational policy has become more evidence-based in most countries
       • What kind of evidence counts in educational policy and practice?
       • What is the role of empirical scientific research?
       • What is the role of comparative education indicators?
       • Is evidence shared and discussed with all stakeholders before leading to policy decisions?
       • Which channels mediate between research, stakeholders and policy makers?
       • How can we improve knowledge mediation?
    66. Conclusions and Questions
       • Innovations in education often lack a systemic approach, with a clear and knowledge-driven vision on implementation, scaling, monitoring and evaluation
       • What role do various kinds of knowledge play in educational innovations?
       • Is knowledge resulting from evaluating innovations used in designing new innovations?
       • Is evidence from evaluations used to enrich the dialogue with practitioners and stakeholders?
    67. Conclusions and Questions
       • Educational systems have become more complex, partly as a result of decentralisation and partly as a result of the multiplication of stakeholders; the governance of complex systems becomes a real challenge
       • Can research evidence and other knowledge help in ‘binding’ the educational system? What functions can knowledge play in terms of governance?
       • Under which conditions, and in what forms, should knowledge be developed, shared and discussed in order to have a productive impact on governance?
    68. Thank you!
       dirk.vandamme@oecd.org
       www.oecd.org/edu/ceri
