Planning Assessment for
Outcome-based Education
J.J. ADRI JOVIN, M.Tech., PhD, Dip. In Yoga, B.G.L., M.A. (Edu)
Professor, Department of Information Technology,
Sri Ramakrishna Institute of Technology
This work is licensed under CC BY-NC-SA 4.0
Overview
Declaration: This is not an AI-generated presentation. Considerable man-hours were spent on
the preparation.
01.12.2025 Planning Assessment for Outcome-based Education 2
• Assessment in Higher Education
• Assessment and Grades
• Are Assessments Reliable?
• The Assessment Cycle
• Planning the Assessment
• Grading Design
• Constructing the Assessment
• Analysis and Evaluation
Assessment in
Higher Education
• Drives student learning
• Creates moments of reflection
• Extent of Success
• Level of competency students are
expected to acquire for a course/
program
Assessment and
Grades
• Poorest form of feedback about complex
performances
• Grades vs Workplace performance
• Gives timely feedback on performance
• Realize intended learning outcomes
• A tangible material
• May deviate from actual skills
Values for
Assessment
• Valid
• Reliable
• Transparent
• Authentic
• Motivate students to learn
• Inclusive
• Sufficiently diverse
• Formative even if it is intended to be summative
• Timely
• Incremental
Effective
Assessment
• Assessment is used to engage students in learning
that is productive.
• Feedback is used to improve student learning.
• Students and teachers become responsible partners
in learning and assessment.
• Students are inducted into the assessment practices
and cultures of higher education.
• Assessment is placed at the centre of the course
and program design.
• Assessment provides an inclusive and trustworthy
representation of student achievement.
Usability of Assessment
Utility formula – 5 parameters:
• Reliability (R)
• Validity (V)
• Educational Effects (E)
• Acceptance (A) – of stakeholders
• Cost (C) – efficiency

U = R × V × E × A × C
Method of Assessment   Reliability   Validity   Educational Effects   Acceptance   Cost
Quiz                   +             +          +/-                   +            +
Assignment (Essay)     -             +          +                     +/-          -
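The utility relation can be made concrete with a small numeric sketch. Mapping the table's +, +/-, and - ratings onto the weights 1.0, 0.5, and 0.25 is an illustrative assumption of this sketch, not part of the source:

```python
# Hypothetical numeric sketch of the utility formula U = R x V x E x A x C.
# The 1.0 / 0.5 / 0.25 weights for the +, +/-, - ratings are assumed
# purely for illustration.
from math import prod

RATING = {"+": 1.0, "+/-": 0.5, "-": 0.25}

def utility(reliability, validity, effects, acceptance, cost):
    """Multiply the five parameter ratings into a single utility score."""
    return prod(RATING[r] for r in (reliability, validity, effects, acceptance, cost))

quiz  = utility("+", "+", "+/-", "+", "+")   # ratings from the table's Quiz row
essay = utility("-", "+", "+", "+/-", "-")   # ratings from the Essay row

print(f"Quiz  U = {quiz}")    # 0.5
print(f"Essay U = {essay}")   # 0.03125
```

Because utility is a product, a single weak parameter (such as the essay's reliability) drags the whole score down, which is exactly the point of multiplying rather than averaging.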
Assessment Cycle
Design
•Learning Outcomes
•Assessment
•Activities
Construction
•Aligned Assessment
•Grading Scheme, Answer Model and Feedback Plan
Conducting
•Where?
•Allowed instruments
•Time
•Guidance, Proctor
•Standard Protocols
Grading and Feedback
•Provide Insights
Analysis
•Student Performance
•Teaching Quality
•Assessment Quality
Evaluation
•Achieve Goals?
•Desired Educational Effect?
•Valid?
•Reliable?
Institutional
Assessment
Policy
Vision on Education
Framework of Principles
Guidelines
Possible Restrictions
Assessment
Plan
• Align the outcomes with the methods of assessment
• Consider the 5 factors
• Assign each method of assessment a relative weight
• Build the Assessment Matrix
Constructive
Alignment
LEARNING OUTCOMES → ACTIVITIES / ASSESSMENTS
• Argue different theories → Discuss theories in class
• Recognise different theories → Multiple-choice quiz in class
Checklist
✓ Learning outcomes reflect levels of learning
✓ Methods of assessment fit with the learning objectives
✓ Teaching activities are aligned with assessments and learning activities
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347–364.
Assessment Plan
01 What would be the ideal methods of assessment to assess the outcomes? (Note: take program outcomes into account.)
02 Consider the function of the assessment. (Use the utility formula. This is not an easy task; take more time to plan your assessments.)
03 Create an assessment plan overview.
LO     Importance of LO     Method of Assessment (Individual Paper / Assignment / Exam)
LO1    20%                  x  x
LO2    30%                  x  x
LO3    30%                  x
LO4    20%                  x  x
Assessment Matrix
• How many questions or assignments?
• What type of questions or tasks? (Avoid formulating too many tasks aimed at the same content or
skill)
Level of cognition/skills following Bloom's Taxonomy
(Remember, Understand, Apply, Analyse, Evaluate, Create)

LO     Levels assessed     Total (%)     Total (marks)
LO1    x  x                20%           32
LO2    x                   10%           16
LO3    x                   30%           48

Translate each percentage into marks (here, against a 160-mark total).
Learning outcomes with multiple cognitive levels may lengthen this process.
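The percentage-to-mark translation is simple proportional arithmetic. A minimal sketch, assuming a 160-mark paper as in the table:

```python
# Sketch: translating LO weight percentages into marks for a paper
# with a fixed total (160 marks, matching the table above).
def to_marks(weights, total_marks):
    """weights: {LO: fraction of the paper}. Returns whole marks per LO."""
    return {lo: round(total_marks * w) for lo, w in weights.items()}

weights = {"LO1": 0.20, "LO2": 0.10, "LO3": 0.30}  # remaining 40% goes to other LOs
print(to_marks(weights, 160))  # {'LO1': 32, 'LO2': 16, 'LO3': 48}
```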
Assessment Matrix for Written Exam
Level of cognition/skills following Bloom's Taxonomy

LO       Remember       Understand      Apply                         Analyse/Evaluate/Create     Total Marks
LO1      40 x 1 mark                                                                              40
LO2                     10 x 2 marks                                  2 x 10 marks                40
LO3                                     6 x 8 marks, 2 x 16 marks                                 80
Total    40 marks       20 marks        80 marks                      20 marks                    160

Question formats: Quiz / Open-Ended / Discussion
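As a sanity check, each LO's total should equal the sum of (number of items × marks per item) over its cells. The sketch below encodes the rows of the matrix; the grouping of cells per LO follows the table, without claiming any particular Bloom-level placement:

```python
# Cross-check of the written-exam matrix: each LO total is the sum of
# (number of questions x marks per question) for its cells.
exam = {
    "LO1": [(40, 1)],            # 40 one-mark items
    "LO2": [(10, 2), (2, 10)],   # 10 two-mark items + 2 ten-mark tasks
    "LO3": [(6, 8), (2, 16)],    # 6 eight-mark + 2 sixteen-mark tasks
}

lo_totals = {lo: sum(n * m for n, m in cells) for lo, cells in exam.items()}
grand_total = sum(lo_totals.values())
print(lo_totals, grand_total)  # {'LO1': 40, 'LO2': 40, 'LO3': 80} 160
```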
Assessment Matrix for Assignment
LO     Grading Criteria     Assignment I     Assignment II
LO1    Criteria 1
LO2    Criteria 2
LO3    Criteria 3

Mapping each Learning Outcome to grading criteria supports validity; applying the same criteria across Assignment I and II supports reliability.
Grading Design
Feedback Plan

Week   Activity/Assessment   Method of Feedback                              Who provides Feedback?   Material to Prepare
1      Lecture               Automated feedback via online quiz              Moodle                   Online quiz and feedback
       Tutorial              Peer review of first part of group assignment   Peer                     Rubric
       Group Assignment      Written feedback                                Teacher                  Rubric
Rubric
Provide good feedback
Communicate with students
Clarification of criteria
Level of correspondence between
assessors
Types of Rubrics
Holistic
Analytic
Single-point
Holistic
Rubrics
Advantages
• Describes multiple parameters in one point.
• Creating and grading consumes less time.
• Suitable for a significant number of
students.
• Reading takes less time.
Disadvantages
• Little feedback is provided.
• Not specific enough.
Analytic
Rubrics
Advantages
• Provide insight into the student’s strengths and
weaknesses.
• Great to get rich feedback.
• Best type of rubric for graded assignments.
• Provide transparency.
Disadvantages
• Time consuming to create.
• Takes time to read.
• Difficult to define criteria specifically enough.
Single-point
rubrics
Advantages
• Freedom for elaborate and personalized
feedback.
• Quick to read for students.
• Quick to create.
• Suitable for practicing feedback skills.
Disadvantages
• Providing feedback costs more time.
Below expectations     Acceptable standard     Exceeds expectations
....                   ....                    ....
....                   ....                    ....
0–X points             X–Y points              Y–Z points
Rubric for Assessing Rubrics
Criteria 1. Unacceptable 2. Acceptable 3. Good/Solid 4. Exemplary
Clarity of criteria
Criteria being assessed are unclear,
inappropriate and/or have significant
overlap.
Criteria being assessed can be
identified, but are not clearly
differentiated or are inappropriate.
Criteria being assessed are clear,
appropriate and distinct between
levels.
Each criterion is distinct, clearly
delineated and fully appropriate for
the assignment(s)/course.
Distinction between Levels
Little/no distinction can be made
between levels of achievement.
Some distinction between levels is
made, but it is not entirely clear how
well each level is met.
Distinction is apparent.
Each level is distinct and progresses
in a clear and logical order.
Reliability of Scoring
Cross-scoring among faculty and/or
students often results in significant
differences.
Cross-scoring by faculty and/or
students occasionally produces
inconsistent results.
There is general agreement between
different scorers when using the
rubric (e.g. differs by less than 5-
10% or less than 1½ level).
Cross-scoring of assignments using
rubric results in consistent
agreement among scorers.
Clarity of Expectations/ Guidance
to Learners
Rubric is not shared with learners.
Rubric is shared and provides some
idea of the assignment/expectations.
Rubric is referenced and used to
introduce an assignment and guide
learners.
Rubric serves as primary reference
point for discussion and guidance for
assignments as well as evaluation of
assignment(s).
Support of Metacognition
(Awareness of Learning)
Rubric is not shared with learners.
Rubric is shared but not
discussed/referenced with respect to
what is being learned through the
assignment(s)/course.
Rubric is shared and identified as a
tool for helping learners to
understand what they are learning
through the assignment/in the
course.
Rubric is regularly referenced and
used to help learners identify the
skills and knowledge they are
developing throughout the
course/assignment(s).
Engagement of Learners in Rubric
Development/ Use *
Learners are not engaged in either
development or use of the rubrics.
Learners are offered the rubric and may
choose to use it for self-assessment.
Learners discuss the design of the
rubric and offer feedback/input and
are responsible for use of rubrics in
peer and/or self-evaluation.
Faculty and learners are jointly
responsible for design of rubrics and
learners use them in peer and/or
self-evaluation.
Source: Rubric for assessing Rubrics by Dr. Bonnie B. Mullinix, Monmouth University.
Setting the
minimum
score
• Relative Methods
• The cut-off score is determined by comparing
students’ results to each other. For example,
only the top 10% or those above the average
may pass.
• Advantages
• Automatically adjusts for external factors
affecting all students (e.g., noisy exam
room).
• Useful for selecting the best candidates
when there are limited spots.
• Disadvantages
• Passing depends on the group’s overall
performance, not on a fixed standard.
• If everyone performs poorly, many may
still pass.
• Year-to-year variations in student ability
can affect fairness.
Setting the
minimum
score
• Absolute Methods
• The cut-off score is set before the
assessment, based on the assessment’s
content (e.g., 60% of total marks to pass).
• Advantages
• Directly measures if students meet
learning objectives.
• Provides feedback on the
effectiveness of teaching.
• Disadvantages
• All students could pass or fail,
regardless of external circumstances.
• Requires high-quality, reliable
assessments.
Setting the
minimum
score
• Compromising Methods
• The cut-off score is set in advance but
can be adjusted based on pre-agreed
conditions (e.g., if too many students
fail).
• Advantages
• Balances fairness and flexibility.
• Adapts to unexpected assessment
outcomes.
• Disadvantages
• May introduce subjectivity if not
clearly defined in advance.
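The three approaches can be contrasted in a short sketch. The top-10% rule, the 60% pass mark, and the "too many students fail" trigger are the illustrative examples from the slides; the 50% failure threshold and the median fallback are hypothetical choices made only for this sketch:

```python
# Sketch contrasting the three ways of setting a minimum (cut-off) score.
def relative_cutoff(scores, top_fraction=0.10):
    """Relative: only the top fraction of the cohort passes."""
    ranked = sorted(scores, reverse=True)
    n_pass = max(1, int(len(ranked) * top_fraction))
    return ranked[n_pass - 1]            # lowest score that still passes

def absolute_cutoff(max_marks, pass_fraction=0.60):
    """Absolute: fixed in advance from the assessment content."""
    return max_marks * pass_fraction

def compromise_cutoff(scores, max_marks, pass_fraction=0.60, max_fail_rate=0.5):
    """Compromise: start absolute; relax only if a pre-agreed condition fires."""
    cutoff = absolute_cutoff(max_marks, pass_fraction)
    fail_rate = sum(s < cutoff for s in scores) / len(scores)
    if fail_rate > max_fail_rate:                  # pre-agreed condition
        cutoff = sorted(scores)[len(scores) // 2]  # hypothetical fallback: median
    return cutoff

scores = [35, 42, 55, 58, 61, 64, 70, 75, 82, 90]
print(relative_cutoff(scores))         # 90  (top 10% of 10 students = 1 student)
print(absolute_cutoff(100))            # 60.0
print(compromise_cutoff(scores, 100))  # 60.0 (only 4/10 fail, so no adjustment)
```

The sketch shows why the relative method tracks the cohort (its cut-off moves with the scores) while the absolute method tracks the content, with the compromise method switching between the two only under conditions agreed before the exam.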
Question Types – A quick walk
Question Type Advantages Disadvantages
Instructor
Marked
• Good for assessing complex learning processes and
creativity
• Relatively easy to write
• Time-consuming to mark
• May require you to defend your marking scheme
Multiple Choice
• Easy to mark
• Easy to collect statistics from
• Good for assessing mastery of details and specific
knowledge
• Tests all lower levels of learning
• Time-consuming to develop good questions with
suitable distractors
• Hard to use for testing higher levels of knowledge
and skills
• Learners may tend to guess
Short Answer
• Fairly easy to develop
• Good for assessing mastery of details and specific
knowledge
• May take some time to develop, because you need to
identify any and all synonymous answers
True/False
• Easy to mark
• Easy to collect statistics from
• Good for assessing mastery of facts
• Hard to use for testing higher levels of knowledge
and skills
• Learners may tend to guess
• Difficult to create unequivocally true or false
statements
Source: Writing Effective Questions by The Learning Management Corporation.
This content is copyrighted by The Learning Management Corporation; the author of this presentation does not claim any ownership over it.
Types of
Open-ended
Questions
Completion Items
Short Answer Questions
Long Answer Questions
Essay Questions
Ideal to assess
the higher levels
of Bloom’s
Taxonomy
Starting with
open-ended
questions
• Start with the model answer (helps to
formulate the right question).
• Be specific (number of arguments in
explanation and the number of examples)
• Provide enough information regarding the
length and format of the answer.
• The information part should be separated
from the question part.
• Formulate the question in positive terms.
• Check the time taken for completion.
Half a page – 10 mins
1 page – 25 mins
2 pages – 1 hour
*based on standard research literature
Group work/ Group project
• Are you a team player?
• Assess the product as well as the process.
• Students may be encouraged to write their own
group assessment criteria (best suited to motivated
students; less effective with disengaged ones).
• Some tweaks:
• Is the project suited for group work?
• Keep midsized groups
• Group roles
• Try solving real-world problems
• Build trust and open communication
Analysis
• Peer Review
• When should analysis happen?
• Take corrective measures before grading (e.g., an
MCQ that turns out to have two correct answers).
• Helps in future improvement (e.g., students did not
grasp a component properly).
• Note:
• Performance data only provides indications of
quality.
• Be careful in drawing conclusions and making
changes (always look at the full picture).
Evaluation
Get back to the utility
formula
Go back to the learning
goal/ learning outcome
and evaluate the
activities related to it
Student feedback
Closing the loop
References and Web Sources
• In-Class Activities and Assessment for the Flipped Classroom
https://uwaterloo.ca/centre-for-teaching-excellence/catalogs/tip-sheets/class-activities-and-assessment-flipped-classroom
• Rubistar http://rubistar.4teachers.org/index.php
• Rcampus https://www.rcampus.com/rubricshellc.cfm?mode=gallery&sms=publicrub
• Rubric Bank, University of Hawaii https://manoa.hawaii.edu/assessment/resources/rubric-bank/
• Recommended Online Course: Assessment in Higher Education: Professional Development
for Teachers offered by Erasmus University Rotterdam via Coursera
Attribution:
This presentation is a blend of my learning from professional experience, the online course
Assessment in Higher Education and the book The Lecturer’s Toolkit: A practical guide to
assessment, learning and teaching, authored by Phil Race.
End of Presentation
Special thanks to the Organizing Team and the Patron of this FDP at Dr. NGP Institute of Technology


Editor's Notes

  • #7 Reliability – repeated administrations of an assessment instrument yield the same results. Reliability is influenced by the length of an exam, the number and diversity of questions, the transparency of the assessment, the clarity of the language used, and the clarity of expectations of both students and the teacher. Validity – does the assessment measure what we want to measure? E.g., using a multiple-choice question to assess whether a student can repair a flat tyre would be invalid. Educational Effects – to what extent does the assessment influence the learning process, and does it have a positive or an adverse effect? Acceptance – all key stakeholders must accept the chosen assessment methodology. Cost – the total cost involved, such as the cost of staff, technology, etc. Let's say the main function of the assessment we need is giving the learner good feedback on his learning outcomes during the learning process. Looking at our bottles, this would mean we'd focus on educational effects and validity, and we might accept a slightly lower level of reliability and acceptance. Knowing we still have other ways to specify our assessment findings, we can still work on acceptability during the learning process, and this might be the profile we're looking for in choosing an assessment method. But now let's assume we need to choose an assessment which offers us the possibility to select the students who will be offered to continue the program, and send away the rest. In this case, we would need a much better-filled reliability bottle. The validity bottle can suffer, but educational effects and cost efficiency might be the bottles to draw some water from. So for this assessment with selection purposes, our assessment methodology might have a profile something like this.
  • #8 You are now aware of the different functions of assessments, and you are able to apply the principles of assessment for learning. Now it is up to you to start designing your own assessments. But where to start? To help you design a high-quality assessment, the steps of the assessment cycle provide a strong handhold.
Before you start, check whether your institute has a specific assessment policy or, even better, a vision on education and the role of assessment within it. The assessment policy and the vision on education will definitely impact your assessment: they provide a framework of principles, guidelines, and possible restrictions. With that framework in mind, you are ready to start the assessment cycle.
The first step is design: making an assessment plan for your course and deciding on the methods of assessment. Do you remember the concept of constructive alignment? Keep your learning goals and learning activities close, as your assessment cannot be seen in isolation from those other sides of the triangle.
The next step is assessment construction: actually developing the assessment tasks or questions. The result of this phase should be a complete assessment with a grading scheme, an answer model, and a feedback plan.
At a certain point, the assessment will be conducted. Beforehand, carefully consider all the conditions under which it takes place. Where will it take place? What are students allowed and not allowed to use? How much time do they have? What guidance do they need? Will there be exam proctors? Is there a standard protocol that applies? Do not forget to provide the necessary instructions to everyone involved.
When the assessment has been conducted, it is time for grading and feedback. Keep your grading scheme and answer model close. Remember that feedback is key for learning and for maximising the educational effect. Your students will always welcome feedback rather than a naked grade, as it gives them more insight into their learning process.
We are almost there, but not finished just yet. In the second-to-last phase, you analyse all assessment results and look into your students' performance on the different tasks or questions. This provides you, as a teacher, with invaluable information: not only about your students' performance, but also about the quality of your assessment and of your own teaching. On which aspects did students perform well, and where did they fail? If good students fail on certain questions, what does this say about the question, or about your teaching? You may need to decide how to fix certain errors or unexpected situations, and think about how this impacts students' grades. (The MOOC helps you analyse your assessment results during its final week.)
Finally, using your analysis, the feedback you receive from your students, and your own experience with the assessment, take time to reflect. Did you achieve your goals? Did the assessment have the desired educational effect? Was it valid and reliable? You might want to alter or improve your assessment, or even decide that next time round you want to go about assessing your students' learning in a totally different way. But with any new plans you make, never forget to keep your learning goals and learning activities in mind, and keep an eye on your institution's or department's vision on assessment and education. And in that way, we are back where we started.
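The analysis of results described in this note can also be done quantitatively with simple item statistics. The sketch below is not from the presentation; the response data, the 27% grouping fraction, and the function name are illustrative assumptions. It computes two classic item-analysis measures: the difficulty index (the proportion of students who answered an item correctly) and the discrimination index (the correct rate of the top-scoring group minus that of the bottom-scoring group; values near zero or negative flag questions worth reviewing).

```python
# Item analysis for a dichotomously scored test: 1 = correct, 0 = incorrect.
# Rows are students, columns are questions. All data here is illustrative.

def item_analysis(responses, group_fraction=0.27):
    """Return (difficulty, discrimination) lists, one value per question.

    difficulty     = proportion of all students answering correctly
                     (higher = easier question)
    discrimination = correct rate in the top-scoring group minus the
                     correct rate in the bottom-scoring group
    """
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]

    # Rank students by total score; take the bottom and top fractions.
    order = sorted(range(n_students), key=lambda i: totals[i])
    k = max(1, round(group_fraction * n_students))
    bottom, top = order[:k], order[-k:]

    difficulty, discrimination = [], []
    for q in range(n_items):
        p_all = sum(responses[i][q] for i in range(n_students)) / n_students
        p_top = sum(responses[i][q] for i in top) / k
        p_bot = sum(responses[i][q] for i in bottom) / k
        difficulty.append(p_all)
        discrimination.append(p_top - p_bot)
    return difficulty, discrimination

# Six students x three questions (made-up responses).
scores = [
    [1, 1, 0],
    [1, 1, 1],
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 1],
    [0, 0, 0],
]
diff, disc = item_analysis(scores)
print(diff)  # difficulty per question
print(disc)  # discrimination per question
```

A question that good students fail and weak students pass shows up here as a negative discrimination value, which is exactly the "what does this say about the question?" signal the note describes.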
  • #17 Holistic – score the overall process or product without judging the component parts separately.
Analytic – assess the different components of a product separately.
Single-point – one description of a standard level of performance for each defined category.