Ideas for Rubrics in STEM Higher
Education
Teaching & Learning
Jace Hargis, PhD
Agenda
• Building Rubrics
• Information Processing
• Assessment
• Bloom's Taxonomy
• Backward Design
  – SLOs
  – Evidence
  – Instruction
• Rubric Examples
Following this session, the participant will be able to:
• Integrate Information Processing as a foundational aspect of teaching, learning, and assessment, indicated by the frequency of formative assessments;
• Create rubrics addressing authentic assessment, measured by the ‘Evidence’ step of Backward Design.
Information Processing (Atkinson and Shiffrin, 1971)
[Diagram: Input → Sensory register → Short-term memory (STM) → Long-term memory (LTM), with Recall; annotated “Rubrics”.]
• Assessment: vehicle for gathering information about learners’ behavior.
• Measurement: assignment of marks based on an explicit set of criteria.
• Evaluation: process of making judgments about the level of understanding.
Assessment of Understanding
You really understand when you can…
1. explain, connect, systematize, predict;
2. show its meaning, importance;
3. apply or adapt it to novel situations;
4. question its assumptions;
5. see it as its author saw it; and
6. identify misconceptions or simplistic views.
What is Formative Assessment?
• Part of the instructional process;
• Provides information at a point when adjustments to teaching can be made;
• A self-reflective, bidirectional process.
Bloom's Taxonomy for Assessment
Knowledge (facts): Who, What, Why, When, Where
Comprehension (interpret): Example, Classify, Infer
Application (new situations): Predict, Select, Identify
Analysis (break into parts): Distinguish, Draw conclusions
Synthesis (patterns): Create, Propose, Plan, Design
Evaluation (criteria): Appraise, Criticize, Defend
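For instructors who draft assessment items systematically, the taxonomy above maps naturally onto a lookup table. A minimal sketch in Python; the level names and cue verbs come from the slide, while the names `BLOOM_VERBS` and `verbs_for` are illustrative, not a standard API:

```python
# A minimal sketch: Bloom's levels mapped to the cue verbs listed above.
# BLOOM_VERBS and verbs_for are illustrative names, not a standard API.
BLOOM_VERBS = {
    "knowledge":     ["who", "what", "why", "when", "where"],
    "comprehension": ["example", "classify", "infer"],
    "application":   ["predict", "select", "identify"],
    "analysis":      ["distinguish", "draw conclusions"],
    "synthesis":     ["create", "propose", "plan", "design"],
    "evaluation":    ["appraise", "criticize", "defend"],
}

def verbs_for(level: str) -> list[str]:
    """Return the cue verbs for a Bloom level, case-insensitively."""
    return BLOOM_VERBS[level.lower()]

print(verbs_for("Synthesis"))  # ['create', 'propose', 'plan', 'design']
```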
Course [re]Design: Understanding by Backward Design (Wiggins & McTighe, 1998)
1. Identify desired results
2. Determine acceptable evidence
3. Plan learning experiences & instruction
Student Learning Outcome (SLO)
• Knowledge, Skills & Dispositions:
– stated with active, high-level verbs;
– performed under specified conditions;
– measured to a stated degree.
– Substance (the subject matter);
– Form (the action the learner performs)
[analyze, demonstrate, derive, integrate, interpret, propose]
(See the structured sketch below.)
1. Identify Desired Results
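One way to make these components concrete is to treat an SLO as a structured record. A sketch, assuming a simple dataclass; the field names are my own, chosen to mirror the bullets above:

```python
from dataclasses import dataclass

# Illustrative only: field names mirror the SLO components on the slide.
@dataclass
class SLO:
    substance: str   # subject matter the outcome addresses
    form: str        # active, high-level verb the learner performs
    conditions: str  # conditions under which it is performed
    degree: str      # to what degree it will be measured

    def statement(self) -> str:
        return (f"Students will {self.form} {self.substance} "
                f"{self.conditions}, {self.degree}.")

slo = SLO(
    substance="the forces acting on a truss",
    form="analyze",
    conditions="given a free-body diagram",
    degree="correctly identifying at least 80% of member loads",
)
print(slo.statement())
```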
• Authentic
• Aligned to outcomes
• Experiential
• Measurable
2. Determine Acceptable Evidence
Using Rubrics …
[Concept map (Felder & Brent, 1999): course-specific goals & objectives and Bloom’s Taxonomy inform SLOs; instruction (lectures, labs, cooperative learning, projects, technology, other experiences) and assessment (classroom assessment techniques, other measures) link students to those SLOs.]
3. Plan learning experiences & instruction
Measuring Assessment Using a Rubric
• Oxford Dictionary: in the mid-15th century, “rubric” referred to the headings of sections of a book. The term stems from monks who reproduced literature, beginning each section of a copied book with a large red letter. The Latin word for red is ruber, so “rubric” came to signify the headings for divisions of a book.
What Are Measurement Rubrics?
• A bridge between SLOs and assessment;
• Used by both students and teachers;
• Define criteria, especially for processes or abstract concepts;
• Provide a common language for assessing complex processes;
• Three features: evaluative criteria, quality definitions, and a scoring strategy (see the sketch below).
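Those three features can be made concrete in code. A minimal sketch, assuming a rubric stored as criteria with per-level quality definitions and scored by summing the selected levels; the structure, criteria, and names are illustrative, not drawn from any of the tools listed later:

```python
# Illustrative sketch of the three rubric features:
#   evaluative criteria -> keys of RUBRIC
#   quality definitions -> per-level descriptions
#   scoring strategy    -> sum of the selected levels' points
RUBRIC = {
    "hypothesis": {
        3: "Testable, clearly stated, grounded in theory",
        2: "Testable but vaguely stated",
        1: "Not testable or missing",
    },
    "data analysis": {
        3: "Correct methods, uncertainty reported",
        2: "Correct methods, no uncertainty analysis",
        1: "Incorrect or missing analysis",
    },
}

def score(ratings: dict[str, int]) -> int:
    """Sum the level chosen for each evaluative criterion."""
    for criterion, level in ratings.items():
        assert level in RUBRIC[criterion], "level must match a quality definition"
    return sum(ratings.values())

print(score({"hypothesis": 3, "data analysis": 2}))  # 5
```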
Example Rubric: Engineering Lab Reports
Critical Thinking Rubric
Rubric Resources
• Rubistar Online Rubric Tool (http://rubistar.4teachers.org)
• Association for the Assessment of Learning in Higher Education (http://course1.winona.edu/shatfield/air/rubrics.htm)
• Rubrix ($50) - http://rubrix.com
• UCF Faculty Center Rubric Page: www.fctl.ucf.edu/TeachingAndLearningResources/CourseDesign/Assessment/AssessmentToolsResources/rubrics.php
How Do You Know When a Rubric Is Good?
• Reliable (r, between -1 and 1)
– the consistency the measurement provides;
– how well a test agrees with itself;
– whether it measures the same thing with similar results.
• Valid
– does it measure what it is intended to measure?
– do differences represent true differences in what is being measured, and not some other factor?
(A worked reliability check follows below.)
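The reliability coefficient r can be checked directly. A minimal sketch computing Pearson's r between two raters scoring the same lab reports with the same rubric; the scores are hypothetical, invented for illustration, and values near 1 indicate the rubric is applied consistently:

```python
from statistics import mean, stdev

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation: r in [-1, 1]; near 1 = consistent ratings."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical scores from two raters using the same rubric.
rater_a = [12, 9, 15, 7, 11, 14]
rater_b = [11, 10, 15, 6, 12, 13]
print(f"inter-rater r = {pearson_r(rater_a, rater_b):.2f}")  # ~0.95 here
```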
Thank You!
Jace Hargis, PhD
