Tools To Assess The Quality Of The Curriculum

How can we assess the quality of the documented curriculum, the enacted curriculum, the assessed curriculum, and the impact of the curriculum on students? From data analysis, to looking at student work, to power standards, to calibration, to professional learning communities, these tools help us to assess the curriculum.



1. Tools to Assess the Quality of the Curriculum: C, I, A, IS
The Parable of the Low-Hanging Fruit
2. In the beginning, all that was necessary for an effective curriculum was to hand a teacher the right text.
3. And because the only assessments were national standardized tests (which had little relationship to the taught curriculum, no matter the text), who really knew what effective curriculum, teaching, or learning looked like? It looked like this:
4. But one day, fearing that the US would lose its status as a world power because of its education system (A Nation at Risk), Ed Reform began the first movement in American education that has ever lasted more than 7 years.
[Chart: Composite Performance Index (CPI) for ELA and Math, from Ed. Reform through 2013 & 14, rising from roughly 53 (math) and 61 (ELA) toward 100.]
Ten years ago, only 24% of the state's 10th graders scored proficient or higher on the math MCAS exam.
5. And since Ed Reform began, we have developed many ways beyond buying a new textbook to fine-tune C, I, A, and IS. Many more are sure to follow. We truly are building the plane while flying it!
6. How do we know if our curriculum is working?
Processes:
- LASW (Looking at Student Work)
- Calibration of standards (NCS Mentor)
- Rubric calibration
- Observation/evaluation of teaching
- Vertical teaming
- Power Standards
- UbD curriculum development / rubrics for evaluation
- Professional Learning Communities
- Critical Friends
Data analysis:
- MCAS, AP, SAT analyses (root cause)
- Local assessments
- Surveys
- Interviews
- Parent comments
- Student work analysis
7. The Change in Assessment from Education Reform: old model vs. new model
8. Tools/Use: C, I, A, IS
9. What Works? (Marzano; in rank order)
In schools and districts:
- Guaranteed (taught) and viable curriculum
  - Opportunity to learn
  - Time
  - Guaranteed, and assessed throughout the year
  - Viable: challenging and rigorous (not onerous)
- Challenging goals and effective feedback
  - Monitoring (timely, formative feedback, not summative assessment)
  - Pressure to achieve
- Parental involvement
  - Good communication is a critical component
- Safe and orderly environment
  - School climate
  - Positive reinforcement
  - A productive climate and culture conducive to learning
- Collegiality
  - Authentic professional interactions, professionalism, content knowledge, and a high correlation with pedagogy
  - Leadership
  - Learning organization
  - Cooperation
In classrooms:
- Effective instructional strategies
  - Flexible grouping, planning, setting goals
  - Interactive learning, ongoing feedback, personalization
  - Identifying similarities and differences; summarizing and note taking; reinforcing effort and providing recognition; homework and practice; graphic organizers; cooperative learning; setting objectives and providing feedback; generating and testing hypotheses; questions, cues, and advance organizers
  - Madeline Hunter: anticipatory set, objective and purpose, input, modeling, checking for understanding, guided practice, independent practice
- Classroom management (discipline, student socialization, teacher behavior, organization, interactions, equity)
- Classroom curriculum design (curriculum assessment)
22. The Research: It's the teacher.
- Most classrooms have mediocre teachers (Elkind, Goodlad, Sizer, Wagner).
- WRITING AND MATH: only 32% of college-bound students are adequately prepared for college, and 58% are in remedial courses (College Knowledge).
- READING: 34% of college graduates can read a complex book and extrapolate from it (NCES statistic).
- WRITING: 24% of students write at the Proficient level; 4% at Advanced (NAEP).
25. The Research: What Works in Schools: Translating Research into Action, Robert J. Marzano.
A meta-analysis of effectiveness based on 35 years of educational research:
- In effective schools, 72.4% of students pass the test; in ineffective schools, 27.6%.
- Teachers: decisions made at the teacher level have a far greater impact than decisions made at the school level.
- In one year, the least effective teachers showed gains of 14% in student achievement; the average teacher, 34%; the most effective, 53%.
- Ineffective strategies: lower-order questions based on recall; teacher talk (lecture, teacher-centered class) instead of providing information in a variety of formats; imprecise feedback on tests (grades); no clear idea of the essential concepts and the scaffolding necessary to get ALL students there.
- The cumulative effect over 3 years: for the most effective teachers, a gain of 83 points; for the least effective, a gain of 29 points.
30. Research: What Works at the SCHOOL level?

31. What Works for the TEACHER:
- Discipline, student socialization, teacher behavior, organization, interactions, equity: routines, classroom climate
- Standards-based curriculum: backwards planning
- Goal setting, measuring progress
32. What to look for in evaluations and walk-throughs:
- Rigor: higher-order thinking skills in questioning, tests, quizzes
- Student engagement (not teacher-centered)
- Writing
- High expectations for all students
- "Front-loaded" units: students know what the final product looks like (exemplars) and how they will be graded (rubric) from the first day of the unit
- Gradual release of responsibility
- Student self-assessment
- Good feedback
41. Benchmarking the MCAS and Standardized Tests: CALIBRATION
Grade 3 MAT at 77th percentile = Proficient on Grade 4 ELA MCAS
Grade 6 MAT at 49th percentile = Proficient on Grade 6 ELA MCAS
Grade 7 MAT at 56th percentile = Proficient on Grade 10 ELA MCAS
Grade 3 MAT at 84th percentile = Proficient on Grade 4 math MCAS
Grade 5 MAT at 69th percentile = Proficient on Grade 5 math MCAS
Grade 7 MAT at 72nd percentile = Proficient on Grade 8 math MCAS
Grade 7 MAT at 67th percentile = Proficient on Grade 10 math MCAS
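The calibration above is essentially a lookup from a norm-referenced percentile to a criterion-referenced benchmark. A minimal sketch of how a data team might encode it so a student's MAT percentile can be checked against each relevant cutoff; the cutoffs are the values from the slide, while the list structure and function name are illustrative, not an actual district tool:

```python
# Hypothetical encoding of the slide's calibration points:
# (subject, MAT grade, percentile cutoff, target MCAS exam)
CALIBRATION = [
    ("ELA", 3, 77, "Grade 4 ELA MCAS"),
    ("ELA", 6, 49, "Grade 6 ELA MCAS"),
    ("ELA", 7, 56, "Grade 10 ELA MCAS"),
    ("math", 3, 84, "Grade 4 math MCAS"),
    ("math", 5, 69, "Grade 5 math MCAS"),
    ("math", 7, 72, "Grade 8 math MCAS"),
    ("math", 7, 67, "Grade 10 math MCAS"),
]

def proficiency_outlook(subject, mat_grade, percentile):
    """For each benchmark at this subject/grade, report whether the
    student's MAT percentile meets the Proficient cutoff."""
    return [(exam, percentile >= cutoff)
            for subj, grade, cutoff, exam in CALIBRATION
            if subj == subject and grade == mat_grade]

# A 7th grader at the 70th percentile in math is on track for
# Proficient on the grade 10 exam (cutoff 67) but not grade 8 (cutoff 72).
print(proficiency_outlook("math", 7, 70))
```

Note that a single MAT score can map to more than one MCAS benchmark (grade 7 math above), which is why a flat list is used rather than a one-to-one dictionary.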
42. Drilling Down with Data: The PIM Process

43. DATA
PIM: http://www.doe.mass.edu/sdi/pim/

44. PIM
Guiding question: Why haven't students in the targeted group learned the skills and knowledge described in the student learning objectives?
46. TestWiz
47. Looking at Student Work: calibration; rubrics; low, medium, and high protocol; teaching to the rubric
48. Can you predict how your students will do on the MCAS based on their class work, your tests, and your textbook assessments?

49. Why LASW?
- Common expectations for writing (and reading)
- Calibration to the MCAS (at least)
- Common language
- A consistent experience for students
- Collaborative lesson planning
- Action plans for three levels of learners

50. DATA: What do you ask of it?
51. Score Analysis: Score Point 2 (AVERAGE SCORE)
This response demonstrates a fair understanding of the mathematical concepts involving integers that underlie the task by completing 3 of the 6 elements.
An incorrect number line is provided, which shows the negative integers placed to the right of zero and the positive integers placed to the left of zero.
The explanation is unacceptable because it does not demonstrate an understanding that negative integers are placed to the left of zero on the number line: "because everything on the right side of the 0 is - what ever number and it just goes like you would count from 0 - when it stops say to -20."
The response correctly indicates that +3 is the greater number and provides an acceptable explanation: "cause -10 is below zero and +3 is above zero."
The response correctly indicates that -3 is the greater number. However, the explanation is circular and does not demonstrate an understanding of negative integers because it is based on the incorrect number line provided in part (a): "because it is on - and if you look at the line above then the -3 is higher then the -10."
52. START HERE
You are here

53. Grade 4 ORQ
Based on the article, describe the challenges Annie Smith Peck faced throughout her life. Support your answer with important details from the article.
56. What is good feedback?
- Focuses on the goal
- Is clear and positive
- Identifies specific strengths
- Points to areas needing improvement
- Suggests a course of action the student can take
- Limits the amount of feedback to what the learner can accomplish
- Models how students can self-assess
- Gives models and rubrics
- Is timely
- For example: hamburger model / 6-trait rubric

What this rubric says to a student is: This is what you are doing now, and this is what you can do to improve.
66. The Case Study Method
- Recognizes complexity: looks at a variety of assessments and the effectiveness of possible interventions while limiting the focus to three students
- Finds commonalities among the three tiers
- Addresses high achievers' needs, which are often ignored
67. Case Study Interventions

68. MCAS Long Essay

69. MCAS Long Essay

70. Setting Clear (and Common) Expectations
71. Needs Improvement: an essay scoring 3 (of 6 for content/organization) in grade 7 ELA. This is what the average score for 7th grade looks like.
NEXT STEPS? Root cause?

72. Consider three levels of response.
73. Math Rubric
Scoring Guide: Students' Heights
- Score point 4: The response shows a comprehensive understanding of stem-and-leaf plots and how to interpret and draw conclusions from them.
- Score point 3: The response shows a general understanding of stem-and-leaf plots and how to interpret and draw conclusions from them.
- Score point 2: The response shows a basic understanding of stem-and-leaf plots.
- Score point 1: The response shows a minimal understanding of stem-and-leaf plots.
- Score point 0: The response is incorrect or contains some correct work that is irrelevant to the skill or concept being measured.
74. NCS Mentor: http://www.ncsmentor.com/default.htm
Score Analysis: Score Point 2
The response demonstrates a basic understanding of a stem-and-leaf plot and how to interpret and draw conclusions from one by completing 3 of the 6 elements.
The response does not correctly identify 147 cm as the mode of the students' heights. Instead, a flawed strategy that averages the heights results in an incorrect mode: "you just add all the students' heights up and divide that number with the number of heights recorded."
An incorrect stem-and-leaf plot is given which includes all of the students' heights but not in the correct format and, therefore, receives no credit.
The response correctly identifies 142 as the median height of the students. The explanation demonstrates a correct strategy for finding the median: "to find the median you put the numbers in order and then the number that is in the middle is the median."
The response provides the correct conclusion about the heights of the two additional students: "One of the new student's heights had to be less than 142 centimeters and the other new student's height had to be more than 142." However, there is no attempt to explain how this conclusion was drawn or to provide a specific example.
Successfully completing 3 of the 6 elements earns this response 2 points.
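The math the scorer is checking here is simple to sketch: find the mode, find the median, and build a stem-and-leaf plot. The heights below are hypothetical, chosen only so that the mode (147) and median (142) match the values discussed above, since the item's actual data set is not reproduced on the slide:

```python
# Illustrative sketch of the scoring task's math, on hypothetical data.
from statistics import median, mode

heights = [135, 138, 140, 141, 142, 147, 147, 147, 156]  # hypothetical

def stem_and_leaf(values):
    """Map each stem (the tens-and-above digits) to its sorted leaves."""
    plot = {}
    for v in sorted(values):
        plot.setdefault(v // 10, []).append(v % 10)
    return plot

print("mode:", mode(heights))      # most frequent height -> 147
print("median:", median(heights))  # middle value once sorted -> 142
for stem, leaves in sorted(stem_and_leaf(heights).items()):
    print(stem, "|", " ".join(str(leaf) for leaf in leaves))
```

The common student error the rubric flags (averaging to find the mode) would show up immediately here: the mean of these heights is not 147.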
75. 1. Begin with the end in mind. 2. Set measurable goals.

76. Understanding by Design
- Standards-based teaching
- Clear goals
- Assessments matched to goals
- Activities are the LAST part of the work.
77. Content priorities
- Worth being familiar with → discussions
- Important to know and do → quizzes, formative assessments, homework
- Big ideas / understandings → major performance assessment or final unit exam
78. Stage 1 – Desired Results
- Content Standard(s): provide a framework for curriculum design; generalizations that define parameters about what students are expected to know and be able to do
- Essential Question(s): inquiry used to explore the generalization to enable students to earn the understanding
- Understanding(s): "Students will understand that…" — insight into the generalization; what students will walk away with
- Knowledge ("Students will know…") and Skills ("Students will be able to…"): specific priorities about what students are expected to know and be able to do
Design standard for ENDURING UNDERSTANDINGS
79. Three stages of backward design
1. Identify desired results
2. Determine acceptable evidence
Then, and only then:
3. Plan learning experiences and instruction
80. Stage 2 – Assessment Evidence
Performance Task(s) and Other Evidence; varied types, over time:
- authentic tasks and projects
- academic exam questions, prompts, and problems
- quizzes and test items
- informal checks for understanding
- student self-assessments

Establishing Curricular Priorities: Assessment Types
- Traditional quizzes and tests (paper/pencil; selected response; constructed response) → worth being familiar with; important to know and do
- Performance tasks and projects (open-ended; complex; authentic) → "big ideas" worth understanding
89. Reliability: snapshot vs. photo album
We need patterns that overcome inherent measurement error.
Sound assessment (particularly of state standards) requires multiple pieces of evidence over time: a photo album rather than a single snapshot.
Should a teenager get a driver's license with just a written test, or just a performance assessment?
91. FACETS OF UNDERSTANDING
Which of the following 6 facets do you expect students to use in this unit to demonstrate their understanding?
- Explanation
- Interpretation
- Application
- Perspective
- Empathy
- Self-knowledge
93. What Works? The complexity of curriculum improvement is growing yearly.

94. Professional Learning Communities
95. "Every educator engages in effective professional learning every day so every student achieves."
- Skills: measuring progress; focus on students first and on results
- Rigor: higher-order thinking skills
- Collaboration: purposeful co-laboring
- Positive school culture
- A resolution for continuous improvement
- The bottom line: students' achievement in the district and their readiness for their future

SEVEN STRATEGIES FOR IMPROVING CURRICULUM, INSTRUCTION, AND ASSESSMENT
- An urgency and understanding of the problem, presented through data
- A shared vision of good teaching that includes rigor, relevance, and respect
- Adult meetings that focus on instruction and model good teaching
- Clear standards, assessments, and a consistent understanding of quality student work
- Supervision that is frequent, rigorous, and focused on instruction
- PD that is primarily on-site, intensive, collaborative, and job-embedded
- Diagnostic data that is used frequently by teams to assess learning and teaching
101. The BLACK BOX: What do we need to do to unpack the needs and potential of the classroom?
102. THE DISCONNECT BETWEEN STANDARDS AND THE CLASSROOM (excerpted from "Inside the Black Box")
- Firm evidence shows that formative assessment is an essential component of classroom work and that its development can raise standards of achievement.
- The main problem is that pupils can assess themselves only when they have a sufficiently clear picture of the targets that their learning is meant to attain. Surprisingly, and sadly, many pupils do not have such a picture, and they appear to have become accustomed to receiving classroom teaching as an arbitrary sequence of exercises with no overarching rationale.
- A particular feature of the talk between teacher and pupils is the asking of questions by the teacher. This natural and direct way of checking on learning is often unproductive.

Changes needed ("Inside the Black Box"): from traditional classroom practices to high expectations
- Student self-efficacy is enhanced with good feedback: attributions for success shift from luck, task difficulty, and ability to hard work.
- Formative assessment works significantly, including with low achievers. (The bell curve predicted, and expected, failure.)
- Grades versus standards: grades rank, but don't inform; the quality of feedback needs to be enhanced.
- Learning must be interactive (the social construction of knowledge).
- Student self-assessment: rubrics and examples before the unit.
- Questioning: convergent thinking versus higher-order thinking skills (HOTS).
- Calibration to standards.
107. Possible Action Responses
- CALIBRATION: baseline data, developing correlation to MCAS
- CONNECTING: daily classroom plans to benchmarks to MCAS
- Collaborative assessment of student work; common lessons to improve; continued assessment: the next step for high achievers, average students, and students with the greatest needs
- Open-response practice across the disciplines: one per unit exam, commonly chosen and commonly assessed
- Commonly assessed ORQs by grade level and department
- Common assessment of the long essay in grades 4, 7, 9, and 10
- Increased time
- Targeted teaching
- Common exams (finals, mid-terms, unit) over time
- Common syllabi for teachers with the same course
- Exams matched to benchmarks and student expectations
118. Action Planning

A SMART goal is Specific, Measurable, Attainable, Reasonable, and Time-specific.
GOAL: To increase fifth grade low-income and SPED math scores by 10 CPI points on the 2009 MCAS. To increase average scores by 10% each quarter.
Objectives:
- To provide increased time and targeted instruction for students who received Warning scores in math in grades 3, 4, and 5 in 2007
- To provide two or three additional periods of math weekly to these students
- To develop a specific curriculum for general weaknesses (ORQ, SA, fractions) and targeted individualized contracts for specific student needs
- To assess student progress every two weeks on general weaknesses and the specific weaknesses of each student
- To purchase Study Island software for all students to allow for supplementary practice at home and at school

Rationale (SWOT analysis: Strengths, Weaknesses, Opportunities, and Threats): What will help or hinder the plan? Consider responses to change: resistance, CBAM responses, and the counter-intuitiveness of change (we go slowly to go fast; instead of anger, embrace opposition; etc.).
- Context: The district has been declared in need of improvement based on the underperformance of two subgroups.
- Strengths: The district has an aligned curriculum with clear benchmarks. The district uses TestWiz to analyze subgroup needs and provide targeted instruction. The district develops an ISSP for each student who has received a Warning on MCAS. There is some homeroom and x-block time to provide targeted instruction to specific students.
- Weaknesses: The district does not have the time, money, or staff to add more instructional time.
- Opportunities: Grant money is available, but only $3,000. Software can provide support and differentiation.
- Threats: Students do not like to attend before- and after-school sessions.

Sample Action Plan
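The quarterly half of the goal above ("increase average scores by 10% each quarter") can be made arithmetically explicit so biweekly data meetings apply it consistently. A minimal sketch with hypothetical quarterly averages; the function names are illustrative, not part of any district tool:

```python
# Hypothetical sketch: check quarter-over-quarter growth against a target.

def quarterly_growth(averages):
    """Percent change from each quarter's average to the next."""
    return [round((b - a) / a * 100, 1)
            for a, b in zip(averages, averages[1:])]

def on_track(averages, target_pct=10.0):
    """True if every quarter-over-quarter gain meets the target."""
    return all(g >= target_pct for g in quarterly_growth(averages))

quarters = [50.0, 56.0, 62.0, 69.0]   # hypothetical average scores
print(quarterly_growth(quarters))     # [12.0, 10.7, 11.3]
print(on_track(quarters))             # True: every gain is at least 10%
```

Making the arithmetic explicit also surfaces an ambiguity worth settling up front: "10% each quarter" compounds, which is a much steeper target than 10 points over the year.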
