4. Rubrics
• Used to score or rate
• Articulates what is being measured and the scale it is being measured on
• Descriptive rubrics: the gold standard
• The scale is further explained for each item being measured
5. A Descriptive Rubric Example
Criterion: Clarity (Thesis supported by relevant information and ideas.)
• Needs Improvement (1): The purpose of the student work is not well-defined. Central ideas are not focused to support the thesis. Thoughts appear disconnected.
• Developing (2): The central purpose of the student work is identified. Ideas are generally focused in a way that supports the thesis.
• Sufficient (3): The central purpose of the student work is clear, and ideas are almost always focused in a way that supports the thesis. Relevant details illustrate the author's ideas.
• Above Average (4): The central purpose of the student work is clear, and supporting ideas are always well-focused. Details are relevant and enrich the work.
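For offices that track rubric scores electronically, an example like the one above maps naturally onto a simple data structure: the criterion, plus a labeled descriptor for each point on the scale. The Python sketch below is illustrative only; the dictionary layout and the describe_score helper are assumptions for this example, not part of any actual advising tool.

```python
# Illustrative sketch: a descriptive rubric as data. The criterion and level
# descriptors come from the example above; the structure itself is hypothetical.
CLARITY_RUBRIC = {
    "criterion": "Clarity (Thesis supported by relevant information and ideas.)",
    "levels": {
        1: ("Needs Improvement", "Purpose not well-defined; central ideas "
            "unfocused; thoughts appear disconnected."),
        2: ("Developing", "Central purpose identified; ideas generally "
            "focused in support of the thesis."),
        3: ("Sufficient", "Central purpose clear; ideas almost always "
            "focused; relevant details illustrate the author's ideas."),
        4: ("Above Average", "Central purpose clear; supporting ideas "
            "always well-focused; details are relevant and enrich the work."),
    },
}

def describe_score(rubric: dict, score: int) -> str:
    """Return the label and descriptor for a given score on a rubric."""
    label, descriptor = rubric["levels"][score]
    return f"{label} ({score}): {descriptor}"

print(describe_score(CLARITY_RUBRIC, 3))
```

Pairing each numeric score with its written descriptor is what makes the rubric "descriptive": the scale is explained for each item, so two raters reading the same descriptor should score similar work similarly.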
6. Your Experience with Rubrics
• Academic assignments
• Judged contests (talent show, costume contest, etc.)
• Performance evaluations
• Candidate evaluations
• To an extent, some survey questions resemble rubrics
8. The Choice: Rooted in Mapping
• Direct measure for learning outcomes
• Context in which learning is measured: mostly advising appointments
• Ease of use
• Versatility
• Conversations in advisement appointments
• “Assignments” completed by advisees
• Timeframes for use
9. Rubric Versatility
Advising Structure:
• College of Arts and Sciences
• School of Architecture and Planning
• School of Engineering and Applied Sciences
• School of Management
• School of Medicine and Biomedical Sciences
• School of Nursing
• School of Pharmacy and Pharmaceutical Sciences
• School of Public Health and Health Professions
• Student Advising Services
• Athletics
• Access to College Excellence Program (ACE)
• Daniel Acker Scholars
• Educational Opportunity Program (EOP)
• Student Support Services (SSS)
• University Honors College
12. Development & Testing
• Developing the criteria and descriptions
• Decision to use a 3-point scale
• Testing
• Several advisors from different departments
• Multiple contexts
• Increased support for the tool
14. Decisions and Tradeoffs
Choice:
• Flexibility for sampling method
• Timeline
• Sample size
Tradeoffs:
• Challenges to the viability of the data
15. Decisions and Tradeoffs
Choice:
• Baseline tool
• Availability
• Ease of use
• Ease of setup
Tradeoffs:
• Lack of flexibility to make certain customizations
  • Comment sections
  • Criteria descriptions
• Built-in reports do not say much
• Advisors cannot view the data they entered
16. Decisions and Tradeoffs
Choice:
• Entering the student's person number
• Demographic data
• Makes using the tool faster when other demographic data is not entered
Tradeoffs:
• Some data is lost when ID numbers are entered incorrectly (see the illustration below)
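The data-loss tradeoff is a familiar record-matching problem: a score keyed on a mistyped ID never matches the student roster, so it silently drops out of any analysis. A hypothetical Python illustration, with all file, column, and ID values invented for the example:

```python
# Illustrative sketch: why mistyped ID numbers lose data.
# All names and values here are hypothetical.
import pandas as pd

roster = pd.DataFrame({"person_number": ["50123456", "50234567"],
                       "class_year": ["Sophomore", "Junior"]})
scores = pd.DataFrame({"person_number": ["50123456", "50243567"],  # second ID mistyped
                       "clarity_score": [3, 4]})

# An inner join on the ID silently drops the mistyped record.
merged = roster.merge(scores, on="person_number", how="inner")
print(merged)  # only one of the two scored students survives
print(len(scores) - len(merged), "record(s) lost to ID entry errors")
```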
18. Challenges
• Training
• People do not understand what a rubric is at first
• When to do the assessment
• Limitations of the Baseline tool
• Challenges to the data based on the tradeoffs
31. “Closing the Loop”
• Share data
• Engage stakeholders
• Make interpretations
• Set targets for improvement
• Brainstorm ideas for improvement
• Implement changes for improvement
• Continue to assess after changes are implemented