
Data Summer


Published on

TechTuneUP 2008, Monday presentation

Published in: Technology, Education


  1. Data Guided Instruction
  2. Why Data?
     - We assess to see if our students are learning.
     - Data gathering and assessment go beyond the grade or score a student receives on a test.
     - Instead, they seek to help a teacher and student better understand what was learned and what comes next.
  3. What are your primary job responsibilities?
     - To teach the content?
     - To ensure that students learn?
     - What is the difference between the two?
  4. 21st Century Skills
  5. Don’t confuse data with NCLB!
     - Data gathering and analysis are an important part of NCLB.
     - You might not like NCLB, but don’t hate data gathering/analysis because of NCLB.
     - Data is data… we need it to see what is happening in our classrooms.
     - We need to make use of this data to help us get better.
  6. Formative Assessment
     - Black and Wiliam published a groundbreaking article in 1998 called “Inside the Black Box: Raising Standards through Classroom Assessment.”
     - How politicians see school (a systems-engineering model): the classroom is the black box. Inputs: students, teachers, requirements, standards, resources ($$$), curriculum, testing, parent needs, etc. Outputs: competent students, satisfied teachers, high test results.
  7. Black & Wiliam
     - Synthesized 250 studies from all over the world. (Yes, other countries know something about teaching and learning.)
     - Found that using formative assessment as a way to promote learning had a positive impact on student learning.
     - The effect size (amount of growth) was 0.4 to 0.7.
     - Further research showed the impact was even greater for low-achieving students.
  8. Amount of Growth
     - A 0.4 effect size is equivalent to an average student moving into the top 35% of a group not involved in the innovation.
     - A 0.7 effect size would move the US, on a recent worldwide math test, from the middle of the 40 countries to the top 5.
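The "top 35%" figure follows from the normal distribution: a gain of d standard deviations moves a 50th-percentile student to the percentile given by the normal CDF at d. A minimal sketch of that arithmetic (the function name is ours, not from the slides):

```python
from statistics import NormalDist

def percentile_after_gain(effect_size):
    """Percentile a previously average (50th-percentile) student reaches
    in the comparison group after gaining `effect_size` standard deviations."""
    return NormalDist().cdf(effect_size) * 100

# d = 0.4 lands at roughly the 65.5th percentile, i.e. in the top ~35%
# of the comparison group, matching the slide's claim.
```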
  9. Formative vs. Summative
     - Formative is assessment along the way.
     - Summative is an end-of-year, end-of-semester, end-of-quarter, or end-of-unit test.
     - Formative assessment comes before the summative assessment and, through classroom instructional practice, should help the student perform better on it.
  10. Black & Wiliam findings, cont.
     - The research indicates that improving learning through assessment depends on five deceptively simple key factors:
     - the provision of effective feedback to pupils;
     - the active involvement of pupils in their own learning;
  11. Black & Wiliam findings, cont.
     - adjusting teaching to take account of the results of assessment;
     - a recognition of the profound influence assessment has on the motivation and self-esteem of pupils, both of which are crucial influences on learning;
     - the need for pupils to be able to assess themselves and understand how to improve.
  12. Providing effective feedback to pupils
     - What is effective feedback?
  13. Active involvement of pupils in their own learning
     - How can we involve students in their own learning?
  14. Adjusting teaching to take account of the results of assessment
     - This goes back to our instructional cycle!
     - What do we do with the assessment data?
  15. Recognition of the influence assessment has on the motivation and self-esteem of pupils, both of which are crucial influences on learning
     - How can we use assessment to make students feel successful?
     - How can we use assessment to motivate students?
     - How can assessment damage a student?
  16. Students need to be able to assess themselves and understand how to improve
     - How do we as teachers make this possible for our learners?
  17. Just 5 things... we can do that, right?!
  18. Inhibiting factors
     - At the same time, several inhibiting factors were identified. Among these are:
     - teachers tend to assess quantity of work and presentation rather than the quality of learning;
     - greater attention is given to grading, much of it tending to lower the self-esteem of pupils, rather than to providing advice for improvement;
  19. Inhibiting factors, cont.
     - comparing students with each other, which demoralizes the less successful learners;
     - teachers’ feedback to students often serves social and managerial purposes rather than helping them to learn more effectively;
     - teachers not knowing enough about their students’ learning needs.
  20. Teachers tend to assess quantity of work and presentation rather than the quality of learning
     - We grade work… what is the grade based on?
     - Doing it, not doing it?
     - Quality of learning?
     - Are all of our assignments (grades) standards-aligned?
     - UbD (Understanding by Design): Is the assignment related to the end result we seek?
     - Crayola curriculum?
  21. Grading practices tend to lower the self-esteem of pupils rather than provide advice for improvement
     - Do you grade students the way you were graded when you went to school?
     - Do students have a chance to improve upon a grade?
     - Does our grading allow student growth?
  22. We use assessment to compare pupils with each other, which demoralizes the less successful learners
     - What is the first thing a student does when you pass back a test?
     - How can we get students to measure their learning based not on what others have done, but on what they themselves have done?
  23. Teachers’ feedback to pupils often serves social and managerial purposes rather than helping them to learn more effectively
     - What does feedback look like in your classroom?
     - How could you make it more specific and meaningful?
     - What if we assigned less “gradable” work and increased the quality of the feedback?
  24. Teachers not knowing enough about their pupils’ learning needs
     - What do you know about your students’ learning needs?
     - How do you keep track of those needs?
     - Do those needs influence your interactions with them?
  25. Frequent, In-Depth Data Use: What Kind? How Much?
     - Formative = diagnostic
     - Frequent enough to show growth
     - Focused practice between measures
  26. Assessment that promotes learning…
     - is embedded in a view of teaching and learning of which it is an essential part;
     - involves sharing learning goals with pupils;
     - aims to help pupils to know and to recognize the standards they are aiming for;
     - involves pupils in self-assessment;
     - provides feedback which leads to pupils recognizing their next steps and how to take them;
     - is underpinned by confidence that every student can improve;
     - involves both teacher and pupils reviewing and reflecting on assessment data.
  27. Instructional Cycle (diagram)
     - What if they already know it? Differentiate.
     - Bloom’s update!
  28. Data Collection
     - Item analysis without technology is almost impossible (time-wise).
     - Bubble tests: Remark OMR software
     - ARS (clickers)
     - Respondus/Blackboard tests
     - Rubrics (complex tasks, subjective)
     - Exit cards (self-assessment with bubbles, subjective)
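Item analysis here means tallying, for each question, what fraction of the class answered correctly, so weak items and untaught objectives stand out. The tools above automate exactly this arithmetic; a minimal sketch (function and data shapes are ours, for illustration):

```python
def item_analysis(responses):
    """Percent of students answering each item correctly.

    `responses` holds one list per student, with 1 for a correct
    answer and 0 for an incorrect one, in item order."""
    n_students = len(responses)
    n_items = len(responses[0])
    return [
        round(100 * sum(student[i] for student in responses) / n_students, 1)
        for i in range(n_items)
    ]

# Three students, three items: everyone got item 1; items 2 and 3
# were each answered correctly by only one student.
# item_analysis([[1, 0, 1], [1, 1, 0], [1, 0, 0]]) → [100.0, 33.3, 33.3]
```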
  29. Analyze the Results
     - Did the student(s) learn?
     - Can they demonstrate the skill?
     - What do the results of the analysis tell you in terms of what to do next? (Inform your instruction.)
     - Ideally we have teams (PLCs, professional learning communities) to discuss the findings.
     - What lessons, techniques, and practices worked? (Tell the story.)
     - Retest/reassess to measure growth (PLC), with questions that measure the same objectives.
  30. Pinnacle Analytics
     - The district’s data warehousing software
     - OAT, OGT, Stanford, Early Literacy, grades, other
     - Still slow-moving… data from spring testing isn’t in PA yet because the SASI rosters aren’t built.
     - This means we can look at the data, but not in the most useful ways.
     - We need administrators to get class placements set earlier.
  31. Excel
     - PA lets you export data easily (really!) into Excel.
     - What do you want to do with Excel?
     - Rank order; compare your class to others
     - Analyze growth from one grade to the next
     - Means, standard deviation?
     - Share results with students… let them play along with the data results.
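The Excel tasks above (means, standard deviation, growth between two administrations) come down to a few lines of arithmetic; a sketch in Python for illustration (function names are ours, not from the slides):

```python
from statistics import mean, stdev

def class_summary(scores):
    """Class mean and sample standard deviation for one assessment."""
    return round(mean(scores), 1), round(stdev(scores), 1)

def growth(pretest, posttest):
    """Per-student gain between two administrations (same student order)."""
    return [post - pre for pre, post in zip(pretest, posttest)]

# class_summary([70, 80, 90]) → (80, 10.0)
# growth([60, 75], [70, 73]) → [10, -2]
```

Sharing this kind of summary with students lets them track their own gains rather than compare themselves against classmates, echoing slide 22.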