EDPC605_7&8

Transcript

  • 1. Thinking Like an Assessor (EDPC605, Chapter 7)
  • 2. Overview
      ◦ Chapter 7 requires the teacher to think like an assessor and to reflect on what he or she does to check for student understanding.
      ◦ How do we know that the student “got it”?
      ◦ Consider the following judicial analogy: “Students are innocent (of understanding, skill, and so on) until proven guilty by a preponderance of evidence that is much more than circumstantial” (p. 148).
  • 3. Stage 2: Assessor’s Questions
      ◦ What evidence can show that students have achieved the desired results?
      ◦ What assessment tasks and other evidence will anchor our curricular units and thus guide our instruction?
      ◦ What should we look for to determine the extent of student understanding?
  • 4. Two Approaches to Thinking About Assessment
  • 5. A Continuum of Assessments
      ◦ “Effective assessment is more like a scrapbook of mementos and pictures than a single snapshot” (p. 152).
      ◦ “Effective teacher-assessors gather lots of evidence along the way, using a variety of methods and formats” (p. 152).
      ◦ “Understanding develops as a result of ongoing inquiry and rethinking” (p. 152).
  • 6. An assessment, problem, or project is authentic if it:
      ◦ Is realistically contextualized (abilities are tested in real-world situations)
      ◦ Requires judgment and innovation (the student has to develop a plan or procedure for solving the problem)
      ◦ Asks the student to “do” the subject
      ◦ Replicates key situations in which adults are truly “tested” in the workplace, in civic life, and in personal life
      ◦ Assesses the student’s ability to efficiently and effectively use a repertoire of knowledge and skill to negotiate a complex, multistage task
      ◦ Allows appropriate opportunities to rehearse, practice, consult resources, and get feedback on and refine performances and products
  • 7. Problems, Not Just Exercises
      ◦ Authentic problem solving requires deciding when to use which approach and which facts.
      ◦ Real performance always involves transfer.
      ◦ Performers must figure out on their own, without simplifying teacher prompts or cues, which knowledge and skills are needed to solve the real problems of performance.
  • 8. GRASPS
      ◦ GRASPS is a design tool used to assist in the creation of performance tasks:
      ◦ Goal (the goal is to…)
      ◦ Role (your job is…)
      ◦ Audience (you need to convince…)
      ◦ Situation (the challenge involves dealing with…)
      ◦ Performance (you will create a…)
      ◦ Standards (your work will be judged by…)
  • 9. Using the Six Facets as Assessment Blueprints
      ◦ Students who really understand…
      ◦ Facet 1: Can explain
      ◦ Facet 2: Can interpret
      ◦ Facet 3: Can apply
      ◦ Facet 4: Can see in perspective
      ◦ Facet 5: Can demonstrate empathy
      ◦ Facet 6: Can reveal self-knowledge
  • 10. “Answers without reasons and support are typically insufficient to ‘convict’ the learner of understanding” (p. 161).
  • 11. Chapter 8: Criteria and Validity
  • 12. The Need for Criteria
      ◦ When assessing, criteria are needed “because the kinds of open-ended prompts and performance tasks needed to assess for understanding do not have a single, correct answer or solution process” (Wiggins & McTighe, p. 172).
      ◦ “Criteria highlight the most revealing and important aspects of the work” (p. 173).
      ◦ “We must clarify a set of independent variables in the performance that affect our judgment of quality” (p. 173).
      ◦ Criteria need to be “central to the performance and its purpose” (p. 173).
  • 13. From Criteria to Rubric
      ◦ Rubrics are criterion-based.
      ◦ Rubrics answer the following questions:
      ◦ How should performance be judged or discriminated?
      ◦ Where should we look, and what should we look for?
      ◦ How should the levels of quality, proficiency, or understanding be described and distinguished from one another? (Wiggins & McTighe, p. 173)
  • 14. Types of Rubrics
      ◦ Holistic: provides an overall impression of the student’s work; you give one grade.
      ◦ Analytic: divides the product or performance into traits, and each trait is judged separately.
  • 15. Rubrics to Assess Understanding
      ◦ Understanding is assessed as a degree on a continuum. A rubric that assesses understanding “must provide concrete answers to our key assessment questions.”
      ◦ Other traits, such as mechanics, craftsmanship, and organization, should be judged separately.
  • 16. Backward Design from Criteria and Rubrics
      ◦ Use the standards as a guide to which criteria should be judged.
      ◦ Develop a rubric from these criteria.
  • 17. The Facets and Criteria
      ◦ Understanding is revealed through the following six facets. The list below gives a partial set of criteria for each facet that can be used to create a rubric. (For a sample rubric, see Figure 8.3 on pp. 178-179.)
      ◦ Facet 1, Explanation: accurate, coherent, justified, systematic, predictive
      ◦ Facet 2, Interpretation: meaningful, insightful, significant, illustrative, illuminating
      ◦ Facet 3, Application: effective, efficient, fluent, adaptive, graceful
      ◦ Facet 4, Perspective: credible, revealing, insightful, plausible, unusual
      ◦ Facet 5, Empathy: sensitive, open, receptive, perceptive, tactful
      ◦ Facet 6, Self-knowledge: self-aware, metacognitive, self-adjusting, reflective, wise
  • 18. Designing and Refining Rubrics Based on Student Work
      ◦ A rubric is built and revised through the analysis of student work.
      ◦ Arter and McTighe (2001, pp. 37-44) propose a six-step process for analyzing student work…
  • 19. The six-step process:
      ◦ Step 1: Gather student samples that show the desired level of understanding.
      ◦ Step 2: Sort the work and record the reasons for each sort.
      ◦ Step 3: Cluster the reasons into traits.
      ◦ Step 4: Write a definition of each trait.
      ◦ Step 5: Select anchor papers for each score point on each trait.
      ◦ Step 6: Continuously refine.
  • 20. The Challenge of Validity
      ◦ “We typically pay too much attention to correctness and too little attention to the degree of understanding.”
      ◦ “Understanding usually falls through the cracks of typical testing and grading.”
      ◦ When developing tasks for students (Stage 2), a teacher faces the challenge of validity: we need to ask our students to produce appropriate evidence of the desired results framed in Stage 1.
  • 21. Backward Design to the Rescue
      ◦ Using backward design can help ensure that assessments are valid. There is an extra step added to Stage 2. Go through the following steps and complete the sentences:
      ◦ Stage 1: If the desired result is for learners to…
      ◦ Stage 2: Then you need evidence of the student’s ability to… So the assessments need to include some things like…
      ◦ There is a Self-Test of Assessment Ideas (Figure 8.5) that teachers can use to evaluate the validity of a task (Wiggins & McTighe, pp. 186-187).
  • 22. Rubric Validity
      ◦ Rubrics also need to be assessed for validity. Ask yourself the following questions:
      ◦ Could the criteria be met without the performer demonstrating deep understanding?
      ◦ Could the criteria not be met even though the performer shows understanding?
      ◦ If you can answer yes to either of these questions, then your rubric and criteria will not produce valid information.
  • 23. Reliability: Our Confidence in the Pattern
      ◦ We cannot use a single “snapshot” to judge a student’s degree of understanding.
      ◦ We need to use many and varied methods to assess students so that we can conclude whether or not they are demonstrating understanding.
  • 24. A Caveat Before Closing
      ◦ This chapter focuses on formal and summative tests; however, ongoing assessments and teacher observations also play a major role in providing us with a “scrapbook of evidence” for understanding.