
The Future of Assessments: Lessons Learned Internationally (Washington, 9 March 2010)


The future of educational assessment


  1. The Future of Assessments: Lessons Learned Internationally
     Washington, 9 March 2010
     Andreas Schleicher, Head, Indicators and Analysis Division, OECD Directorate for Education
  2. The future of assessments: the Holy Grail, or the Alchemists' Stone?
     Know why you are looking
     - You cannot improve what you cannot measure
     - The yardstick for success is no longer just improvement by national standards, but the best-performing education systems globally
     Know what you are looking for
     - A new assessment culture: responsive to changing skill requirements, capitalising on methodological advances, not sacrificing validity gains for efficiency gains
     Know how you will recognise it when you find it
     - Gauging predictive validity
     - Impact on improving learning and teaching
     Implications and lessons learned
  3–12. A world of change – higher education
     [Chart build-up slides: expenditure per student at tertiary level (USD) against tertiary-type A graduation rate, highlighting among others the United States, Finland, Australia and the United Kingdom]
  13–15. Latin America then and now: why quality is the key (Hanushek 2009)
     [Chart build-up slides]
  16. Know what you are looking for
     The Holy Grail was a well-described object, and there was only one true grail…
  17. Schooling in the medieval age: the school of the church
  18. Schooling in the industrial age: uniform learning
  19. Schooling in the industrial age: uniform learning
     The challenges today:
     - Universal quality
     - Motivated and self-reliant citizens
     - Risk-taking entrepreneurs, converging and continuously emerging professions tied to globalising contexts and technological advance
  20. How the demand for skills has changed
     [Chart: economy-wide measures of routine and non-routine task input (US); mean task input as percentiles of the 1960 task distribution]
     The dilemma of assessments: the skills that are easiest to teach and test are also the ones that are easiest to digitise, automate and outsource (Levy and Murnane)
  21. Changing skill demands
     - The great collaborators and orchestrators: the more complex the globalised world becomes, the more individuals and companies need various forms of co-ordination and management
     - The great synthesisers: conventionally, our approach to problems was to break them down into manageable bits and pieces; today we create value by synthesising disparate bits together
     - The great explainers: the more content we can search and access, the more important the filters and explainers become
  22. Changing skill demands
     - The great versatilists: specialists generally have deep skills and narrow scope, giving them expertise that is recognised by peers but not valued outside their domain; generalists have broad scope but shallow skills; versatilists apply depth of skill to a progressively widening scope of situations and experiences, gaining new competencies, building relationships and assuming new roles. They are capable not only of constantly adapting but also of constantly learning and growing.
     - The great personalisers: a revival of interpersonal skills, skills that have atrophied to some degree because of the industrial age and the Internet
     - The great localisers: localising the global
  23. Education today needs to prepare students…
     … to deal with more rapid change than ever before, for jobs that have not yet been created, using technologies that have not yet been invented, to solve problems that we don't yet know will arise.
     It's about new…
     - Ways of thinking, involving creativity, critical thinking, problem-solving and decision-making
     - Ways of working, including communication and collaboration
     - Tools for working, including the capacity to recognise and exploit the potential of new technologies
     - The capacity to live in a multi-faceted world as active and responsible citizens
  24. Mathematics in PISA: the modelling cycle between the real world and the mathematical world
     - Start from a real situation and build a model of reality by understanding, structuring and simplifying the situation
     - Make the problem amenable to mathematical treatment, turning the model of reality into a mathematical model
     - Use relevant mathematical tools to solve the problem and obtain mathematical results
     - Interpret the mathematical results as real results
     - Validate the results against the real situation
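A minimal, purely illustrative walk through this modelling cycle on a toy problem (not a PISA item; the scenario, figures and function names below are invented for illustration):

```python
# Illustrative walk through the modelling cycle on a toy problem:
# "Roughly how many pizzas are needed for a school event?"
# All numbers and names are hypothetical; the point is the sequence of steps.
import math

def model_of_reality(students: int, share_attending: float) -> int:
    # Understand, structure and simplify: assume a fixed attendance share.
    return round(students * share_attending)

def mathematical_model(guests: int, slices_per_guest: int, slices_per_pizza: int) -> float:
    # Make the problem amenable to mathematical treatment: a simple ratio.
    return guests * slices_per_guest / slices_per_pizza

def solve(pizzas_needed: float) -> int:
    # Use mathematical tools: round up to whole pizzas.
    return math.ceil(pizzas_needed)

guests = model_of_reality(students=480, share_attending=0.75)          # model of reality
pizzas = solve(mathematical_model(guests, slices_per_guest=3, slices_per_pizza=8))
print(f"Interpretation (real result): order about {pizzas} pizzas")
# Validation: compare the result with the real situation (e.g. last year's order)
# and revisit the simplifying assumptions if it looks implausible.
```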
  25. National assessment systems differ…
     - Who is assessed: the student (student assessment), the classroom and teacher (teacher appraisal), the school (school evaluation), the system (system assessment)
     - By whom: evaluators, users of feedback, evaluation agencies
     - With whom: the agents involved
     - What: inputs, processes, outcomes
     - How: methods and procedures, the mix of criteria and instruments, the mapping of feedback to different units
     - For what: e.g. accountability, improvement
  26. Assessment cultures: putting pressure on the top of the education system is the easy part; building capacity is harder
     [Diagram spanning two axes, participative/internal versus administrative/external and formative versus summative, locating approaches such as the interactive, reflective "critical friend", surveys, standardised assessment and the inspectorate]
  27. Assessment cultures: efficiency gains versus validity gains
     - Participative/internal: formative classroom-based assessments (e.g. Europe, Asia)
     - Administrative/external, favouring efficiency gains: large-scale, high-stakes summative assessments, typically multiple-choice to contain costs (US, England, Latin America…)
     - Administrative/external, favouring validity gains: large-scale, low-stakes assessments, where sample-based administration allows for complex task types (e.g. Northern Europe, Scotland, PISA)
  28. …but there are global trends: multi-layered, coherent assessment systems from classrooms to schools to regional, national and international levels that…
     - Support improvement of learning at all levels of the education system
     - Are largely performance-based
     - Make students' thinking visible and allow for divergent thinking
     - Are adaptable and responsive to new developments
     - Add value for teaching and learning by providing information that can be acted on by students, teachers and administrators
     - Are part of a comprehensive and well-aligned continuum, communicate what is expected and hold relevant stakeholders accountable
     - Use what the assessment reveals about students' thinking to shape better opportunities for student learning
     - Enhance student learning through tasks crafted to incorporate principles of learning
     - Capitalise on improved data-handling tools and technology connectivity to combine formative and summative assessment interpretations into a more complete picture of student learning
     - Integrate, synthesise and creatively apply content knowledge in novel situations
     - Activate students as owners of their own learning and as learning resources for one another
  29. Know how you will recognise it when you find it
     The Alchemists' Stone was to be recognised by transforming ordinary metal into gold…
  30. Increased likelihood of postsecondary participation at age 19/21 associated with PISA reading proficiency at age 15 (Canada), after accounting for school engagement, gender, mother tongue, place of residence, parental education and family income (reference group: PISA Level 1)
     [Chart: odds ratios of college entry, by school marks at age 15 and by PISA performance at age 15]
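Odds ratios of this kind are typically obtained from a logistic regression of postsecondary entry on PISA proficiency level plus the control variables, expressed relative to the reference group. The sketch below shows the mechanics only, on synthetic data with hypothetical variable names; it is not the Canadian analysis itself:

```python
# Minimal sketch of how such odds ratios are estimated (synthetic data, hypothetical names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "pisa_level": rng.integers(1, 6, n),      # proficiency level at age 15
    "engagement": rng.normal(0, 1, n),        # school engagement (standardised)
    "female": rng.integers(0, 2, n),
    "family_income": rng.normal(0, 1, n),
})
# Synthetic outcome: higher proficiency -> higher chance of postsecondary entry.
logit_p = -1.5 + 0.6 * df["pisa_level"] + 0.3 * df["engagement"] + 0.2 * df["female"]
df["postsec"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression with Level 1 as the reference category, controls included.
model = smf.logit(
    "postsec ~ C(pisa_level, Treatment(reference=1)) + engagement + female + family_income",
    data=df,
).fit(disp=0)
odds_ratios = np.exp(model.params)   # exponentiated coefficients = odds ratios vs Level 1
print(odds_ratios.round(2))
```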
  31. Relationship between test performance and economic outcomes
     [Chart: percentage addition to annual GDP from raising performance by 25 PISA points]
  32. Increase average performance by 25 PISA points (total: USD 115 trillion)
     [Chart: projected gains by country, in billions of USD]
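Long-run estimates of this kind rest on compounding: a small, permanent increase in annual GDP growth attributed to higher skills is accumulated and discounted over decades. A minimal sketch of that compounding logic follows, with placeholder parameters rather than the figures behind the USD 115 trillion estimate:

```python
# Illustration of the compounding logic behind long-run "value of skills" projections.
# All parameters are placeholders, not the values used in the published estimates.

def discounted_gdp_gain(baseline_gdp: float, base_growth: float,
                        extra_growth: float, years: int, discount: float) -> float:
    """Present value of the GDP stream with versus without the extra growth."""
    gain = 0.0
    for t in range(1, years + 1):
        with_reform = baseline_gdp * (1 + base_growth + extra_growth) ** t
        without_reform = baseline_gdp * (1 + base_growth) ** t
        gain += (with_reform - without_reform) / (1 + discount) ** t
    return gain

# Hypothetical numbers: a 40-trillion-dollar economy, 1.5% baseline growth,
# +0.3 percentage points of growth from higher skills, an 80-year horizon, 3% discounting.
gain = discounted_gdp_gain(40e12, 0.015, 0.003, 80, 0.03)
print(f"{gain / 1e12:.1f} trillion USD (illustrative only)")
```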
  33. Implications and lessons learned
     - The medieval Alchemists followed the dictates of a well-established science, but one built on wrong foundations
     - The search for the Holy Grail was overburdened by false clues and cryptic symbols
  34. From assessment-inhibited practice towards outcome-driven reform
     [2×2 diagram: focus on processes (strong/weak) against outcome-based management (weak/strong)]
     - Strong focus on processes, weak outcome-based management: good will and trust
     - Strong focus on processes, strong outcome-based management: integrated quality management
     - Weak focus on processes, strong outcome-based management: external control, uninformed prescription
     - Weak focus on processes, weak outcome-based management: deprivation
  35. Some criteria used around the world
     - Coherence: built on a well-structured conceptual base, an expected learning progression, as the foundation for both large-scale and classroom assessments; consistency and complementarity across administrative levels of the system and across grades
     - Comprehensiveness: using a range of assessment methods to ensure adequate measurement of the intended constructs, and measures of different grain size to serve different decision-making needs; providing productive feedback, at appropriate levels of detail, to fuel accountability and improvement decisions at multiple levels
     - Continuity: a continuous stream of evidence that tracks the progress of individual students
  36. Designing assessments: assessment frameworks
     - A working definition of the domain and its underlying assumptions
     - Organising the domain and identifying the key task characteristics that guide task construction
     - Operationalising task characteristics in terms of variables
     - Validating the variables and assessing the contribution each makes to understanding task difficulty
     - Establishing an interpretative scheme
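One common way to carry out the validation step, offered here as an illustration rather than the framework's prescribed method, is to regress empirical item difficulties on the coded task characteristics and inspect each variable's contribution. The data and feature names below are synthetic:

```python
# Sketch: regress empirical item difficulties on coded task characteristics
# to see how much each variable contributes. Data and feature names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_items = 120
items = pd.DataFrame({
    "text_complexity": rng.integers(1, 4, n_items),   # coded 1-3 by expert raters
    "steps_required":  rng.integers(1, 5, n_items),   # number of solution steps
    "open_response":   rng.integers(0, 2, n_items),   # 0 = multiple choice, 1 = constructed
})
# Synthetic "empirical" difficulty (e.g. an IRT b-parameter) driven by the characteristics.
items["difficulty"] = (0.4 * items["text_complexity"] + 0.5 * items["steps_required"]
                       + 0.6 * items["open_response"] + rng.normal(0, 0.5, n_items))

fit = smf.ols("difficulty ~ text_complexity + steps_required + open_response", data=items).fit()
print(fit.params.round(2))          # contribution of each characteristic
print(f"R^2 = {fit.rsquared:.2f}")  # share of difficulty variance the characteristics explain
```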
  37. Understanding learning progressions (Wilson, ATC21S)
     - Learning targets: defining what mastery means for a given skill level
     - Progress variables: delineating a pathway that characterises the steps learners typically follow as they become more proficient; evaluating students' reasoning in terms of the correctness of their solutions as well as their complexity, validity and precision
     - Levels of achievement: describing the breadth and depth of the learner's understanding of the domain at a particular level of advancement
     - Learning performances: operational definitions of what students' understanding would look like at each stage of progress
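As an illustration of how these elements fit together, a progress variable can be encoded as an ordered set of achievement levels, each carrying the learning performances that count as evidence for it. The structure and examples below are hypothetical, not Wilson's or the ATC21S specification:

```python
# Hypothetical encoding of a learning progression: a progress variable as an
# ordered list of achievement levels, each with its expected learning performances.
from dataclasses import dataclass, field

@dataclass
class AchievementLevel:
    name: str
    description: str                  # breadth/depth of understanding at this level
    learning_performances: list[str]  # observable evidence of that understanding

@dataclass
class ProgressVariable:
    skill: str
    levels: list[AchievementLevel] = field(default_factory=list)  # ordered, lowest first

    def level_for(self, score: int) -> AchievementLevel:
        # Map an observed score (0 .. len(levels)-1) onto a level description.
        return self.levels[max(0, min(score, len(self.levels) - 1))]

argumentation = ProgressVariable(
    skill="Using evidence in explanations",
    levels=[
        AchievementLevel("Emerging", "Restates a claim without evidence",
                         ["repeats the claim", "cites no data"]),
        AchievementLevel("Developing", "Links a claim to a single piece of evidence",
                         ["points to one observation in support of a statement"]),
        AchievementLevel("Proficient", "Weighs competing explanations against evidence",
                         ["synthesises evidence from multiple sources into an argument"]),
    ],
)
print(argumentation.level_for(2).name)   # -> Proficient
```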
  38. The PISA science framework
     Context: personal, social/public, global
     Competencies:
     - Identify scientific issues: recognising issues that can be investigated scientifically, identifying keywords in a scientific investigation, recognising the key features of a scientific investigation
     - Explain phenomena scientifically: applying knowledge of science in a situation, describing or interpreting phenomena scientifically or predicting change
     - Use scientific evidence: interpreting scientific evidence and drawing conclusions, identifying the assumptions, evidence and reasoning behind conclusions
     Knowledge:
     - Knowledge of science: physical systems (structure of matter, properties of matter, chemical changes of matter, motions and forces, energy and its transformations, energy and matter), living systems (cells, humans, populations, ecosystems, biosphere), earth and space (structures of the earth system, energy in the earth system, change in the earth system, earth's history, space), technology systems (concepts and principles, science and technology)
     - Knowledge about science: scientific enquiry (purpose, experiments, data, measurement, characteristics of results), scientific explanations (types, rules, outcomes)
     Attitudes:
     - Interest in science: showing curiosity about science and science-related issues and endeavours; willingness to acquire additional scientific knowledge and skills using a variety of resources and methods; willingness to seek information and take an interest in science, including consideration of science-related careers
     - Support for scientific enquiry: acknowledging the importance of considering different scientific perspectives and arguments; supporting the use of factual information and rational explanation; valuing logical and careful processes in drawing conclusions
     - Responsibility
  39. What the proficiency levels mean: OECD Level 2 versus Level 6
     Identify scientific issues
     - Level 2: students can determine whether scientific measurement can be applied to a given variable in an investigation, and can appreciate the relationship between a simple model and the phenomenon it is modelling
     - Level 6: students demonstrate the ability to understand and articulate the complex modelling inherent in the design of an investigation
     Explain phenomena scientifically
     - Level 2: students can recall an appropriate, tangible scientific fact applicable in a simple and straightforward context and can use it to explain or predict an outcome
     - Level 6: students can draw on a range of abstract scientific knowledge and concepts, and the relationships between them, in developing explanations of processes
     Use scientific evidence
     - Level 2: students can point to an obvious feature in a simple table in support of a given statement, and can recognise whether a set of given characteristics applies to the function of everyday artefacts
     - Level 6: students demonstrate the ability to compare and differentiate among competing explanations by examining supporting evidence, and can formulate arguments by synthesising evidence from multiple sources
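Level descriptors like these attach to bands of the reporting scale, and a student's scale score is mapped to a level by comparison with published cut-points. The sketch below shows only the mechanics; the cut-point values are placeholders, not the official PISA science thresholds:

```python
# Mechanics of mapping a scale score to a proficiency level.
# Cut-points below are placeholders, not the official PISA science thresholds.
from bisect import bisect_right

LEVEL_CUTPOINTS = [335, 410, 485, 560, 635, 710]   # illustrative lower bounds for Levels 1-6

def proficiency_level(score: float) -> int:
    """Return 0 (below Level 1) through 6 for a given scale score."""
    return bisect_right(LEVEL_CUTPOINTS, score)

print(proficiency_level(430))   # -> 2 with these illustrative cut-points
print(proficiency_level(720))   # -> 6
```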
  40. Some methodological challenges (Wilson, ATC21S)
     - Can we sufficiently distinguish the role of context from that of the underlying cognitive construct?
     - Do new types of items enabled by computers and networks change the constructs being measured?
     - Can we drink from the firehose of data streams that new assessment modes generate?
     - Can we use new technologies and new ways of thinking about assessment to gain more information from the classroom without overwhelming it with more assessments?
     - What is the right mix of crowd wisdom and traditional validity information?
     - How can we create assessments that activate students' own learning?
  41. Setting priorities: policy value versus feasibility
     [2×2 diagram: policy value (high/low) against feasibility (low/high), with quadrants labelled "quick wins", "must haves", "money pits" and "low-hanging fruits"]
     Candidate priorities placed on the map include: a real-time assessment environment that bridges the gap between formative and summative assessment; examining individual, institutional and systemic factors associated with performance; extending the range of competencies through which quality is assessed; monitoring educational progress; measuring growth in learning; establishing the relative standing of students and schools; and assuming that every new skill domain is orthogonal to all others
  42. Getting the sequencing right
     [Table: how assessment policy shifts across three phases of system development – poor to adequate, adequate to good, and good to great]
     - Main focus of assessment: tackling underperformance, transparency, spreading best practice, world-class performance, continuous learning and innovation
     - Role of government: prescribing, regulating, justifying, capacity-building, enabling, incentivising
     - Role of professions: accommodating, implementing, accepting evidence, adopting minimum standards, evidence-based, adopting best practice, leading, evidence-driven, achieving high reliability and innovation
     - Nature of the relationship between government and professions: top-down, antagonistic, pragmatic, negotiated, strategic partnership, principled
     - Main outcomes: improvement in outcomes, reduction of public anxiety, steady improvement, growing public satisfaction, consistent quality, public engagement and co-production
  43. www.oecd.org; www.pisa.oecd.org
     All national and international publications; the complete micro-level database
     Email: Andreas.Schleicher@OECD.org
     Twitter: @SchleicherEDU
     … and remember: without data, you are just another person with an opinion.
     Thank you!
