2. Assessment
Assessment can occur at different times
during the semester, and there are many
different types of assessment: formal
or informal, summative or formative,
rubric-based. This session will review
different types of assessment and the
assessment techniques that you are using
in your courses.
3. Video Clip
Spies Like Us
https://www.youtube.com/watch?v=gk5R9GmIsAs
4. Warm-Up
1. What is the purpose of assessment?
2. What are your assessment strengths?
3. What questions do you have about
assessment?
5. Assessment and Evaluation
Assessment is a process of determining
"what is."
Evaluation uses evidence generated
through assessment to make judgments of
relative value.
"If you don't have any goals, you don't have
anything to assess"
6. Video Clip
The Paper Chase: Socratic Method
https://www.youtube.com/watch?v=qx22TyCge7w
7. Potential Impact of Assessments
Find out what and how your students are
thinking
Clarify your goals
Get feedback to make mid-course corrections
Become exposed to how students learn your
discipline and identify means to respond to
different learning styles
Help students become self-aware of their
learning
Leave behind a trail of information that can be
used for post-course improvement (for students
and teacher)
8. Five Dimensions of Learning
Declarative Learning (What)
Procedural Learning (How)
Conditional Learning (When & Where)
Reflective Learning (Why)
Metacognitive Learning (How to Learn)
9. Validity
Needs a Clear Purpose
Face validity:
do the assessment items appear to be
appropriate?
Content validity:
does the assessment content cover
what you want to assess?
Criterion-related validity:
do the results correlate with an external
criterion, such as performance on
another established measure?
Construct validity:
are you measuring what you think
you're measuring?
10. Reliability
Factors that affect reliability include:
The length of the assessment – a longer assessment generally
produces more reliable results.
The suitability of the questions or tasks for the students being
assessed.
The phrasing and terminology of the questions.
The consistency in test administration – for example, the
length of time given for the assessment, instructions given to
students before the test.
The design of the marking schedule and moderation of
marking procedures.
The readiness of students for the assessment – for example,
a hot afternoon or straight after physical activity might not be
the best time for students to be assessed.
12. Assessing the Objective
Formative=Ongoing
Gather feedback that can be used by
the instructor and the students to
guide improvements in the ongoing
teaching and learning.
Example: Early course evaluations,
quizzes, scaffolding essays with an
outline
Summative=Summary
Measure the level of success or
proficiency that has been obtained at
the end of an instructional unit, by
comparing it against some standard or
benchmark.
Example: Post unit exams, portfolio
assignments
Formal
Have data that support the conclusions
drawn from the test. We usually refer to
these types of tests as standardized
measures.
Example: Standardized exams with
numerical scores.
Informal
Not data driven but rather content
and performance driven.
Examples: Performance
assessments (recital), single-subject
designs
13. Formative
Formative Assessment
Formative assessment provides feedback
and information during the
instructional process, while learning is
taking place. Formative assessment
measures student progress, but it can
also assess your own progress as an
instructor.
14. Ongoing Assessments for
Students
1. Journal entry
2. Short answer test
3. Open response test
4. Oral responses during class discussion
5. Portfolios
6. Exhibition
7. Culminating product
15. Summative
Summative assessment takes place after
the learning has been completed
and provides information and feedback
that sums up the teaching and
learning process. Typically, no more formal
learning is taking place at
this stage, other than incidental learning
which might take place through
the completion of projects and
assignments.
16. Using a Rubric
A rubric is a scoring tool that explicitly
describes the instructor’s performance
expectations
17. Types of Rubrics
Holistic
gives a single score or rating for an
entire product or performance based
on an overall impression of a student’s
work
Analytic
divides a product or performance into
essential traits or dimensions so that
each can be judged separately
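The holistic/analytic distinction can be sketched in code: a holistic rubric yields one overall rating, while an analytic rubric rates each essential trait separately and can combine the ratings into a score. A minimal Python sketch, assuming hypothetical trait names and weights (not taken from any specific example rubric):

```python
# Analytic rubric sketch: essential traits, each judged separately,
# here combined into one weighted score. Trait names, weights, and
# the 4-point scale are illustrative assumptions.
RUBRIC_WEIGHTS = {
    "thesis": 0.30,
    "evidence": 0.40,
    "organization": 0.20,
    "mechanics": 0.10,
}

# Labels echo the measurement terms listed elsewhere in this deck.
LEVELS = {1: "Beginning", 2: "Developing", 3: "Accomplished", 4: "Exemplary"}

def score_analytic(ratings):
    """Weighted average of per-trait ratings (each 1-4)."""
    if set(ratings) != set(RUBRIC_WEIGHTS):
        raise ValueError("every trait in the rubric must be rated")
    return sum(RUBRIC_WEIGHTS[t] * ratings[t] for t in RUBRIC_WEIGHTS)

def score_holistic(overall_rating):
    """Holistic rubric: one overall impression, reported as a label."""
    return LEVELS[overall_rating]

ratings = {"thesis": 3, "evidence": 4, "organization": 3, "mechanics": 2}
print(round(score_analytic(ratings), 2))  # 3.3
print(score_holistic(3))                  # Accomplished
```

Whether to report the combined total or the per-trait breakdown is a design choice; the analytic form is what lets instructors and students see strengths and weaknesses trait by trait.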
20. Steps to Rubric Development
Determine learning outcomes
Keep it short and simple (Include 4 – 15 items; use
brief statements or phrases)
Each rubric item should focus on a different skill
Focus on how students develop and express their
learning
Evaluate only measurable criteria
Ideally, the entire rubric should fit on one sheet of
paper
Reevaluate the rubric (Did it work? Was it sufficiently
detailed?)
21. Terms Used to Measure
Needs
Improvement…Satisfactory…Good…Exemplary
Beginning…Developing…Accomplished…Exemplary
Needs work…Good…Excellent
Novice…Apprentice…Proficient…Distinguished
A numeric scale, for example ranging from 1 to 5
22. Benefits of Rubrics – Instructor
reduce the time spent grading
help instructors more clearly identify
strengths and weaknesses across an entire
class and adjust their instruction
help to ensure consistency across time
and students
reduce the uncertainty which can
accompany grading
discourage complaints about grades
23. Benefits of Rubrics – Student
understand instructors’ expectations and
standards
use instructor feedback to improve their
performance
monitor and assess their progress as they
work towards clearly indicated goals
recognize their strengths and weaknesses
and direct their efforts accordingly
24. Example Rubrics
Papers
Example 1: Philosophy Paper This rubric was designed for student papers in a range of courses in philosophy
(Carnegie Mellon).
Example 2: Psychology Assignment Short, concept application homework assignment in cognitive psychology
(Carnegie Mellon).
Example 3: Anthropology Writing Assignments This rubric was designed for a series of short writing assignments
in anthropology (Carnegie Mellon).
Example 4: History Research Paper. This rubric was designed for essays and research papers in history (Carnegie
Mellon).
Projects
Example 1: Capstone Project in Design This rubric describes the components and standards of performance from
the research phase to the final presentation for a senior capstone project in design (Carnegie Mellon).
Example 2: Engineering Design Project This rubric describes performance standards for three aspects of a team
project: research and design, communication, and teamwork.
Oral Presentations
Example 1: Oral Exam This rubric describes a set of components and standards for assessing performance on an
oral exam in an upper-division course in history (Carnegie Mellon).
Example 2: Oral Communication This rubric is adapted from Huba and Freed, 2000.
Example 3: Group Presentations This rubric describes a set of components and standards for assessing group
presentations in history (Carnegie Mellon).
Class Participation/Contributions
Example 1: Discussion Class This rubric assesses the quality of student contributions to class discussions. This is
appropriate for an undergraduate-level course (Carnegie Mellon).
Example 2: Advanced Seminar This rubric is designed for assessing discussion performance in an advanced
undergraduate or graduate seminar.