
Designing Rubrics for Competency-based Education


Presentation at the Competency-based Education Workshop sponsored by Penn State's Center for Online Innovation in Learning on October 29, 2015.

Published in: Education

  1. DESIGNING RUBRICS FOR COMPETENCY-BASED EDUCATION. Center for Online Innovation in Learning (COIL) Competency-based Education Workshop (10/29/2015). Kyle Peck
  2. “WHAT WILL PEOPLE NEED FROM US?” • Access to high-quality content will increasingly be free. • MOOCs and other forms of peer- and machine-evaluated learning experiences will improve dramatically. • Our primary service will be issuing high-quality credentials based on high-quality assessments of higher-order capabilities. • (And perhaps creating and sustaining learning communities, but that’s another discussion.)
  3. THE POWER OF “FEEDBACK”: PERFORMANCE • ASSESSMENT • FEEDBACK-FOCUSED
  4. THE RUBRIC: THE PREFERRED WAY TO ASSESS HIGHER-ORDER LEARNING • Easy to use and to explain • Make expectations very clear to learners • Provide students with more and better feedback about their strengths and areas in need of improvement • Support learning and the development of skills • Support good thinking. • Based on the “Rubrics and Bloom’s Taxonomy” wiki, licensed under a Creative Commons Attribution Share-Alike 3.0 License.
  5. WHAT IS A RUBRIC? • A rubric is a scoring guide that seeks to evaluate a student's performance based on the sum of a full range of criteria rather than a single numerical score. • A rubric is an authentic assessment tool used to measure students' work. • Authentic assessment is used to evaluate students' work by measuring the product according to real-life criteria. • A rubric is a working guide for students and teachers, usually handed out before the assignment begins in order to get students to think about the criteria on which their work will be judged.
  6. TYPES OF RUBRICS • “Holistic” rubrics • “Analytical” rubrics • “Task-specific” rubrics • “General” rubrics
  7. HOLISTIC RUBRICS • Make a single assessment of the “overall quality” of the project • Are quick and easy • May be reliable (?) but are not likely to be as valid as analytic rubrics • Provide little information to the user
  8. AN EXAMPLE OF A HOLISTIC RUBRIC • “Consistently does all or most of the following:” (list of good things) • “Does most or many of the following:” (same list of good things) • “Does most or many of the following:” (list of bad things) • “Consistently does all or almost all of the following:” (moderately different list of bad things)
  9. HOLISTIC RUBRICS? • Good for sorting. • Not good for understanding or improving performance.
  10. ANALYTIC RUBRICS • Identify the criteria that are important to a quality product or performance. • Identify levels or ratings for each criterion. • Identify descriptions of performance on each criterion at each level. • Often provide scores for each criterion, based on ratings, and sum the scores to produce an overall score or grade.
  11. ANATOMY OF A (TYPICAL) ANALYTIC RUBRIC. Image from the “Rubrics and Bloom’s Taxonomy” wiki, licensed under a Creative Commons Attribution Share-Alike 3.0 License.
  12. A SAMPLE RUBRIC: Rubric for a Chocolate Chip Cookie, with criteria, ratings, and descriptions.
  13. WHAT’S WRONG WITH THIS RUBRIC? Rubric for a Chocolate Chip Cookie?
  14. A TEMPLATE FOR AN ANALYTIC RUBRIC (sample-of-analytic-scoring-rubrics?related=1)
  15. MOST RUBRICS HAVE REAL PROBLEMS!! • Validity, the extent to which an assessment measures what it claims to measure, is compromised by an imbalance in the number of criteria of a given type, which places undue emphasis on less important factors. • When there is an imbalance among criteria, the resulting assessment is misleading. • This can be resolved by grouping and weighting criteria, but most rubrics don’t.
  16. MOST RUBRICS HAVE REAL PROBLEMS!! • Reliability, the extent to which an assessment produces stable scores for the same product or performance when used by different reviewers or repeatedly by the same reviewer, is compromised by using the same number of ratings (usually 4) for every criterion. • Some criteria really have only two levels; others may have many. • Forcing an inappropriate number of categories increases the probability that raters will choose different ratings. • Using ratings that reflect actual performance will increase reliability.
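One common way to check the reliability concern above is to have two reviewers rate the same product and measure how often they choose the same rating, criterion by criterion. A minimal sketch of exact percent agreement; the criterion names and rating labels are invented for the example:

```python
def percent_agreement(ratings_a, ratings_b):
    """Share of criteria on which two raters chose the same rating.

    ratings_a / ratings_b map criterion name -> chosen rating label.
    """
    shared = ratings_a.keys() & ratings_b.keys()
    if not shared:
        return 0.0
    agree = sum(1 for c in shared if ratings_a[c] == ratings_b[c])
    return agree / len(shared)

# Hypothetical ratings of the same essay by two reviewers.
rater_1 = {"thesis": "strong", "evidence": "adequate", "mechanics": "clean"}
rater_2 = {"thesis": "strong", "evidence": "weak", "mechanics": "clean"}
print(percent_agreement(rater_1, rater_2))  # agree on 2 of 3 criteria
```

A criterion forced into four levels when performance really has two will tend to pull this number down, which is the slide's point: match the number of ratings to the actual levels of performance.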
  17. MOST RUBRICS HAVE REAL PROBLEMS!! • When “multidimensional” criteria are used: • the quality and utility of feedback are reduced, • scoring is made more difficult, and • reliability and validity are reduced.
  18. AN EXAMPLE OF MULTI-DIMENSIONAL CRITERIA: “Offers solid but less original reasoning. Assumptions are not always recognized or made explicit. Contains some appropriate details or examples.”
  19. A BETTER, COMPETENCY-BASED ANALYTIC RUBRIC • Weighted “criteria” collect weighted “indicators.” • Indicators can have different numbers of ratings, which have numeric values and can indicate mastery. • Descriptions, importance statements, recommendations, and more are stored “behind the scenes” when rubrics are created.
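The weighted-criteria model described above can be sketched in code. This is an illustrative sketch only, not the Penn State Rubric Processor's implementation; the criterion names, weights, rating scales, and mastery thresholds are invented for the example:

```python
# Hypothetical rubric: each criterion carries a weight, the numeric value of
# its top rating, and the minimum rating that counts as mastery. Note the
# criteria need not all use the same number of rating levels.
CRITERIA = {
    "argument": {"weight": 3, "max": 4, "mastery_at": 3},
    "evidence": {"weight": 2, "max": 4, "mastery_at": 3},
    "mechanics": {"weight": 1, "max": 2, "mastery_at": 2},  # only two levels
}

def score(ratings):
    """Return a weighted percentage score and per-criterion mastery flags.

    ratings maps criterion name -> numeric rating chosen by the assessor.
    """
    earned = sum(CRITERIA[c]["weight"] * r for c, r in ratings.items())
    possible = sum(spec["weight"] * spec["max"] for spec in CRITERIA.values())
    mastery = {c: r >= CRITERIA[c]["mastery_at"] for c, r in ratings.items()}
    return 100 * earned / possible, mastery

pct, mastered = score({"argument": 4, "evidence": 3, "mechanics": 2})
```

Because the weights live in the rubric definition rather than in the assessor's head, grouping and re-weighting criteria (the fix proposed on slide 15) becomes a data change, not a scoring-procedure change.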
  20. NEW RUBRIC-BASED TOOLS? The Penn State “Rubric Processor” • When assessing, selecting a rating provides stored descriptions and recommendations that will be collected to form a narrative report. • These may be edited to personalize the message, as needed. • After the assessments are complete, emails to students and aggregated reports may be generated.
  21. VISUAL REPORT FROM THE RUBRIC PROCESSOR • Red indicates a performance rating below the competency/mastery level. • Gold indicates a performance at the competency level. • A “score” is calculated based on the weightings. • Aggregated scores from groups of learners can be represented as percentages in each cell. • Group displays could be used as a “visual query generator,” calling up random, anonymized examples of student work at each level.
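The “percentages in each cell” idea reduces to counting how many learners landed at each rating level for a given criterion. A small sketch, with made-up level names and ratings:

```python
from collections import Counter

def cell_percentages(group_ratings, levels):
    """For one criterion, percent of learners at each rating level.

    group_ratings is one rating label per learner; levels lists the rubric's
    rating levels for this criterion, so empty cells still show 0 percent.
    """
    counts = Counter(group_ratings)
    n = len(group_ratings)
    return {level: 100 * counts[level] / n for level in levels}

levels = ["novice", "developing", "competent"]
ratings = ["competent", "developing", "competent", "novice", "competent"]
print(cell_percentages(ratings, levels))
```

A visual report would then color each cell by whether its level meets the mastery threshold and scale it by the percentage, giving an instructor a one-glance view of where the group stands on each criterion.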
  22. IN SUMMARY • Assessments of higher-order work will become our primary business. • Rubrics are an excellent way to provide high-quality assessments of higher-order products and performances. • Not all rubrics are created equal; most have serious flaws. • Effective rubrics are based on sound learning outcomes and corresponding assignments that elicit the desired performances.
  23. NOW IT’S YOUR TURN! In your table groups, follow these steps to create a rubric: • Start with a well-written learning outcome for a higher-order task. • Identify the aspects of the product or performance that determine quality. Each of these becomes a criterion. (Avoid multi-dimensional criteria! You can group criteria, but don’t combine them.) • Group and/or weight the criteria based on their importance. • Identify the levels of performance expected for each criterion, assign scores for each rating, and/or determine which level(s) will be accepted as “mastery” or “competency.”
  24. NEXT STEPS TO COMPLETE YOUR RUBRIC Consider the following to increase the validity and reliability of your rubric: • Share the rubric with experts to establish validity and identify ways to improve it. • Pilot the rubric with learners to determine: 1. whether the number of ratings for each criterion is appropriate, 2. whether the ratings convey adequate information to users to result in improvement upon re-submission, and 3. whether the top products or performances as indicated by the rubric match experts’ holistic impressions. • Revise as necessary. • Celebrate!
  25. THANK YOU. This presentation is available on