Assessment with a Purpose


Teacher training presentation on assessment use and purpose.

Published in: Education
  • Are we assessing for the sake of assessment, or are we assessing purposefully and thoughtfully, in a manner that makes the time invested worth it?
  • Use your data teams to help you prioritize your instructional goals. Do you have power or priority standards established? How do you determine your core standards? If this work is done alone, you run into the problem of differing priorities and interpretations. Make it part of a collaborative effort so there is a vertical continuum and congruence across grades.
  • S-17: Just an organizer for you. As we look at the four suggested uses of classroom assessment, think about whether these elements are part of your classroom practice. If you are an administrator, how aware are you of these practices occurring in your classrooms?
  • Note that the post-test-only model doesn’t factor in students’ pre-instructional status, which makes it difficult to determine whether student learning was impacted by instruction or by other factors. Pretesting, then comparing post-test results to pretest results, allows a teacher to determine his or her instructional impact on student learning; the student becomes his or her own “control” in this model. Where high student mobility is an issue, analyze the cohort of students who were both pre- and post-tested when evaluating your own impact. Evaluate all post-test scores for all students to gauge student mastery of the content, but use the cohort’s pre-to-post comparison to help you understand your instructional impact. This can also be a time saver because it wraps back to what we said about assessing prior learning. You may already be doing this, but how intentional are you? How are you using your data teams/PLCs to help you do this work? This helps determine instructional impact, and students can chart their own progress. (Recommendation 2, IES Practice Guide)
  • Discuss and share out. Have someone record a list popcorned out by participants.
  • So how do you do it? “Flexible, en route test-guided instructional scheduling can allow your students to move on to fascinating application activities or delve more deeply into other content areas.” page 12.
  • Have participants read the benchmark, testing tactics, and instructional implications on page 24 of the book (see page 23 for the set-up). Mention this is an oversimplification for illustration only. This leads to the discussion of test-triggered instruction, where teachers get an idea of how the content standard is operationalized by reflecting on the test item and considering the cognitive demand of the task, as well as the skills needed to answer the item successfully. This should lead into a discussion about generalizable skill mastery. Teaching to the test results in a narrow focus on the specific tasks presented in the items; teaching toward test-represented targets considers the several different ways the content standard is tested. The follow-up is to determine the skills, knowledge, subskills, prior knowledge, and cognitive demand of the task, and to use this information to build instruction and classroom assessment. When teachers then use diverse methods of assessing a content standard, they are seeking to get a fix on students’ generalizable mastery. The more diverse the assessment techniques, the stronger the inferences you can draw about the cognitive demand your assessments are placing on your students. Once you have an idea of the cognitive demand of the task and your students’ readiness, you can use diverse, assessment-grounded instructional methods to build generalizability of the skills and knowledge.
  • S-18 of supplemental materials. Meet with a clock buddy to discuss your three big ideas. Switch to the next clock buddy for further sharing, so that people who did the homework get a chance to be with someone else who did it.
  • What constitutes validity in creating an exam? In selecting one? Look for alignment. The curriculum is so large that you can’t hit it all, so prioritize: instruction is targeted at the most important subsets of the curriculum, the power standards, and the assessment samples the content you taught.
  • S-19: Put this in the supplemental materials and refer to it at this time for the team exercise. What do validity and reliability look like? Give participants a handout of this to work on individually, then have them discuss with a partner, then discuss the next slide. The targets represent what you are trying to measure: a standard, a curricular goal, etc. The green lines are the scores on an assessment designed to measure the target. Think of these as four different tests of the same standard: a group of students with the same ability level takes each of the four exams, and the green dashes represent their scores on each test. What does each pattern represent in terms of validity and reliability?
  • The validity and reliability of scores or test results determine the extent to which you can make meaningful interpretations or inferences, as well as the degree of confidence you can place in those interpretations. This is particularly important when using scores to predict future performance.
  • These questions sum up the assessment framework process we’ve been engaged in for the day, and they should be part of your consideration when acquiring or developing assessment systems. DIBELS example: form effects arise because the scales aren’t built for progress monitoring or for adjusting for skill development over time, and parallel forms are not available, or not available in sufficient quantity, to avoid form effects.
  • Assessment with a Purpose
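The pre/post cohort comparison described in the notes above can be made concrete with a little arithmetic. The sketch below is illustrative only and is not part of the presentation; the student IDs, scores, and the mastery cutoff of 75 are all invented. It shows the two separate questions the notes distinguish: instructional impact (measured on the cohort tested both times, each student serving as his or her own control) and content mastery (measured on all post-tested students).

```python
# Hypothetical pre/post scores keyed by invented student IDs; students who
# moved in or out appear in only one dict (a minimal sketch, not a gradebook).
pretest  = {"s1": 40, "s2": 55, "s3": 35, "s4": 60}
posttest = {"s1": 78, "s2": 80, "s4": 85, "s5": 72}  # s3 moved away, s5 moved in

# Cohort = students with BOTH scores; each student is their own "control".
cohort = pretest.keys() & posttest.keys()
gains = {sid: posttest[sid] - pretest[sid] for sid in cohort}
avg_gain = sum(gains.values()) / len(gains)

# Mastery check uses ALL post-test scores, not just the cohort.
MASTERY_CUTOFF = 75  # invented threshold for illustration
mastery_rate = sum(score >= MASTERY_CUTOFF for score in posttest.values()) / len(posttest)

print(f"Average pre-to-post gain (cohort of {len(cohort)}): {avg_gain:.1f} points")
print(f"Post-test mastery (all {len(posttest)} students): {mastery_rate:.0%}")
```

With these invented numbers the cohort is three students, so the gain estimate reflects only students you actually taught start to finish, while the mastery rate still counts the student who arrived mid-unit.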

    1. Assessment for the sake of assessment? Or, Assessment with a purpose!
    2. Take a few minutes to journal on the topic: What do you know about using classroom assessment? What puzzles you? How can you explore this topic today?
    3. So many targets, what is a teacher to do?
       • Prioritize based on importance (performance standards and data-driven needs)
       • Use data teams and action research to prioritize the focus of your change or improvement efforts
       • Share effective strategies for instruction
       • Create formative measures
       • Monitor results
    4. Use of assessment: Happens? To what degree?
       • Decisions about curriculum alignment
       • Decisions about students’ prior knowledge
       • Decisions about how long to teach something
       • Decisions about effectiveness of instruction
    5. Simple, but powerful model…
    6. Remember KWL?
       • Can also pre-assess through informal means like KWL
       • What other pre-assessments do you use that aren’t a commercial product?
       • Brainstorm with your team a quick list of pre-assessment strategies. Be prepared to share at least 2!
    7. The Dipstick Assessment: How long do I need to teach this?
       • Item-sampling method for quick assessment
          • Different students complete different subsamples of items from your unit test (a couple of items each)
          • Takes less than five minutes to administer to students
          • Gives quick fix on status of entire class—not intended for inferences about individual students
       (Popham, p. 12)
    8. What about Generalizability? Aka: Why diverse stimuli matter!
       • Look at various ways a content standard is assessed
       • Teach toward the skills or knowledge a test represents, not toward the test itself
       • Extend, apply, etc. for generalizability
       (Popham, pp. 23-27)
    9. Big Ideas from Stat 101: Validity and Reliability
       • Is this test instrument valid? The test measures what you are trying to measure. Are you assessing what you are instructing?
       • Is this test reliable? Do I get consistent results over time? Is the sample size of items or responses large enough that you can make inferences from the results?
    10. What constitutes valid scores? Align the assessment with:
       • Objectives for instruction (content and skills you plan to teach)
       • Actual instruction that preceded the assessment (content and skills you actually taught)
       • Decisions or conclusions you plan to make using interpretation of the resulting scores
    11. What does it look like?
       • Valid and reliable shooting
       • Unreliable and invalid shooting
       • Reliable but invalid shooting
    12. Valid? Reliable? Both? Neither? Mertler, Craig A. (2003). Classroom Assessment: A Practical Guide for Educators. Los Angeles, CA: Pyrczak Publishing.
    13. Valid? Reliable? Both? Neither?
    14. To Sum Up: When selecting and implementing assessments to augment state and classroom formative assessment… Ask yourself these important questions!
    15. Important Questions to Ask
       • What do you want to learn from the assessment?
       • What steps will be taken as a result of the assessment? What decisions are the results intended to inform?
       • How will student learning improve as a result of using the assessment system?
       • Will it improve more than if the assessment system wasn’t used?
       (Center for Assessment and Accountability, WestEd, 2008)
    16. Finally, let’s review our own questions
       • Take a few minutes to reread your journal from the beginning of the session. Turn to an elbow partner and share:
       • I used to think … about assessment in my classroom.
       • But now I think …
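The Dipstick item-sampling idea from slide 7 can be sketched numerically. The example below is illustrative and not from the presentation: the item names, class size, and "grading" (a weighted coin flip standing in for real scoring) are all invented. It shows the core mechanic, that each student answers only a couple of items, yet every item gets enough responses to estimate class-level status.

```python
import random
from collections import defaultdict

# Invented roster and test for illustration: a 10-item unit test, 25 students.
items = [f"item{n}" for n in range(1, 11)]
students = [f"s{n}" for n in range(1, 26)]

random.seed(0)  # fixed seed so the sketch is repeatable
# Each student answers a different 2-item subsample of the test.
assignments = {s: random.sample(items, k=2) for s in students}

# "Grade" the responses; a weighted coin flip stands in for real scoring.
correct = defaultdict(int)
attempts = defaultdict(int)
for student, assigned in assignments.items():
    for item in assigned:
        attempts[item] += 1
        correct[item] += random.random() < 0.7  # stand-in for actual grading

# Class-level estimate per item -- NOT a judgment about any single student,
# since each student saw only two items.
for item in items:
    if attempts[item]:
        print(f"{item}: {correct[item] / attempts[item]:.0%} correct "
              f"({attempts[item]} sampled responses)")
```

With 25 students answering 2 items each, the class produces 50 responses spread across 10 items, so each item is sampled roughly 5 times, enough for a quick whole-class fix in the few minutes the slide describes.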
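The target-shooting analogy from slides 11-13 can also be expressed numerically: validity corresponds to low bias (scores centered near the true level of the standard), and reliability corresponds to low spread (scores consistent across parallel tests). The sketch below uses invented scores and an invented "true score" purely to illustrate the analogy; it is not from the presentation.

```python
import statistics

TRUE_SCORE = 80  # hypothetical "true" level of the standard being measured

# Invented scores of the same group on three hypothetical tests of one standard.
tests = {
    "valid & reliable":     [79, 80, 81, 80, 79],   # tight cluster on target
    "reliable but invalid": [65, 66, 64, 65, 66],   # tight cluster, off target
    "unreliable & invalid": [50, 90, 62, 95, 55],   # scattered and off target
}

for name, scores in tests.items():
    bias = statistics.mean(scores) - TRUE_SCORE   # low |bias| ~ validity
    spread = statistics.stdev(scores)             # low spread ~ reliability
    print(f"{name}: bias={bias:+.1f}, spread={spread:.1f}")
```

The "reliable but invalid" case is the cautionary one from the slides: the scores are highly consistent, but consistently measuring the wrong thing.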
