Dr. Mrs. Mary Sulakshini Immanuel
CONTENTS
• Objectives
• Five key words
• Definition
• Place of Evaluation in educational cycle
• Historical overview
• Purpose of Evaluation
• Role of Evaluation in educational cycle
• Student Evaluation – What for
• Characteristics of Evaluation
• Process of Evaluation
• Common defects of examinations
• Selecting evaluation instruments
• Test Blueprint Design
• Evaluation Tools – Methods
• Trends concerning the Evaluation of clinical skills
• Characteristics of constructive feedback
• Facilitating behavior of an Instructor
• Interference behavior
• Problems of Evaluation
• Helping students to learn from the test
• Helping yourself learn from the test
• Returning test papers
• Dealing with an aggrieved student
• What to do with the student who missed the test
• Notice to all teachers
OBJECTIVES
• Update knowledge regarding various evaluation strategies
• Understand different tools and techniques used in evaluation
• Identify different evaluation strategies and their application in different settings
• Discuss basic item analysis and sociometry
FIVE KEY QUESTIONS
•What
•Why
•How
•When
•Who
DEFINITIONS
E – Value
Evaluation is a systematic process that determines the extent to which instructional objectives are achieved; it includes both quantitative and qualitative descriptions of the learner's behavior and a value judgment concerning the desirability of that behavior.
Evaluation is a process of determining the extent to which instructional objectives are achieved by the learner.
PLACE OF EVALUATION IN THE EDUCATIONAL CYCLE
Definition of Objectives → Program Implementation → Evaluation → (back to Definition of Objectives)
HISTORICAL OVERVIEW
Examinations and statistical methods were in use as early as 400 BC.
1873: Nurse report cards; concern over evaluation.
1921: Efficiency reports; students detained on 'whims'; a long and tortured history of clinical evaluation.
1930: Checklists – personality assessment.
1940: World War II – disuse of the checklist.
1959: Self-evaluation.
1963: Critical incident report.
1970: Behavior checklist.
1980: Competency-based evaluation.
1992: Formative and summative evaluation.
WHY? THE PURPOSE OF EVALUATION
• Diagnosis
• Prediction
• Grading
• Selection
• Guidance
• Exploration
• Evaluation of teaching
• Motivation
ROLE OF EVALUATION IN TEACHING
1. Contributes directly to teaching
2. Pre-assesses the learners' needs
3. Provides relevant teaching
4. Evaluates the intended outcomes
5. Tracks the progress of the student
6. Improves future learning experiences
STUDENT EVALUATION – WHAT FOR
1. Incentive to learn, motivation
2. Feedback to students
3. Modification of learning activities
4. Selection of students
5. Determination of success or failure
6. Feedback to the teacher
7. School–public relations
8. Protection of society
CHARACTERISTICS OF EVALUATION
1.Appropriateness
2.Effectiveness
3.Practicability
4. Ease of administering
5.Ease of Scoring
6.Ease of Interpretation
7. Continuity
8. Objectivity
9. Relevance
10. Test usefulness
11. Precise and clear
12. Adequacy and balance
13. Utility
14. Comparability
15. Validity
16. Reliability
17. Equilibrium
18. Equity
19. Specificity
20. Time
21. Length
PROCESS OF EVALUATION
• Identify the purpose of Evaluation
• Identify the time frame
• Determine when to evaluate
• Select the evaluators
• Choose an evaluation design/model
• Select the Evaluation instruments
• Collect data
• Interpret data
• Report the findings
• Consider the cost of evaluation
COMMON DEFECTS OF EXAMINATIONS
• Triviality
• Outright error
• Ambiguity
• Obsolescence
• Bias
• Complexity
• Unintended clues
• Conservatism
SELECTING EVALUATION INSTRUMENTS
• Measurement purpose
• Inference level
• Validity + Reliability
• Feasibility
• Effect on students
TEST BLUEPRINT – DESIGNING
Consider:
o Objectives of the course and the major concepts
o The duration of teaching
o Level of learning outcomes
o Weightage and points
TEST CONSTRUCTION – SPECIFICATION TABLE
E.g. Fundamentals of Nursing, Units 4, 5 and 6 – marks weightage by content and cognitive process:

Content   Knowledge   Comprehension   Application   Analysis   Synthesis   Evaluation   Total
Unit 4       10            12              10           4          4           10         50
Unit 5        5             6               5           2          2            5         25
Unit 6        5             6               5           2          2            5         25
Total        20            24              20           8          8           20        100
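Each cell in the table above is simply the unit's share of the total marks multiplied by the weightage of the cognitive-process level. The Python sketch below is not part of the original presentation; it only illustrates that rule, with the unit and process shares taken from the worked example above.

```python
# Minimal sketch: generate a test specification (blueprint) table.
# Assumption: each cell = unit share x process share x total marks.

TOTAL_MARKS = 100

# Shares taken from the worked example above.
unit_share = {"Unit 4": 0.50, "Unit 5": 0.25, "Unit 6": 0.25}
process_share = {"Knowledge": 0.20, "Comprehension": 0.24, "Application": 0.20,
                 "Analysis": 0.08, "Synthesis": 0.08, "Evaluation": 0.20}

header = ["Content"] + list(process_share) + ["Total"]
print("  ".join(f"{h:>13}" for h in header))

col_totals = {p: 0 for p in process_share}
for unit, u in unit_share.items():
    row = []
    for process, p in process_share.items():
        marks = round(u * p * TOTAL_MARKS)   # e.g. Unit 4 x Knowledge = 0.50 * 0.20 * 100 = 10
        col_totals[process] += marks
        row.append(marks)
    print("  ".join(f"{c:>13}" for c in [unit] + row + [sum(row)]))

print("  ".join(f"{c:>13}" for c in ["Total"] + list(col_totals.values()) + [sum(col_totals.values())]))
```

Running this reproduces the Unit 4, 5 and 6 rows and the column totals shown in the table.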
Decide how many questions to set, what weightage to give each, and how much time to allow for each question in a one-hour paper. The teacher should take the test first and allow the students about three times that time.
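A minimal sketch of the timing rule just stated, using hypothetical per-question timings that are not from the presentation: the teacher attempts each question, students are allowed roughly three times the teacher's time, and the total is checked against the one-hour paper.

```python
# Minimal sketch of the "three times the teacher's time" rule for a one-hour paper.
PAPER_MINUTES = 60
STUDENT_FACTOR = 3   # students are allowed ~3x the teacher's own time

# Hypothetical teacher timings (minutes) for each question.
teacher_minutes = {"Q1": 4, "Q2": 6, "Q3": 5, "Q4": 5}

student_minutes = {q: t * STUDENT_FACTOR for q, t in teacher_minutes.items()}
total = sum(student_minutes.values())

for q, m in student_minutes.items():
    print(f"{q}: allow about {m} min")
status = "fits within" if total <= PAPER_MINUTES else "exceeds"
print(f"Total of {total} min {status} the {PAPER_MINUTES}-minute paper")
```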
EVALUATION TOOLS / METHODS OF EVALUATION
Categorized into:
Knowledge
Attitude
Skills
COGNITIVE/KNOWLEDGE
SUBJECTIVE TESTS
o Essay type
o Descriptive or narrative type
OBJECTIVE TYPE
o Multiple choice questions
o Single choice questions
o Matching
o True – False
o Fill in the blanks
COGNITIVE/KNOWLEDGE
Problem-solving type
o Assignment
o Anecdotal records
o Achievement tests
o Teacher-made tests
o Standardized tests
o Simulation
AFFECTIVE/ATTITUDES
o Interview
o Anecdotal records
o Attitude scales
o Rating scales
o Cumulative record
o Sociometry
AFFECTIVE/ATTITUDES
o Projective techniques
o Aptitude tests
o Observational techniques
o Group discussion
o Interaction reports
o Log book
AFFECTIVE/ATTITUDES
o Video taping
o Care plans
o Debate
o Position paper
o Critical incident
o Seminar presentation
o Critique presentation
PSYCHOMOTOR/SKILLS
• Performance appraisal
• Critical incident technique
• Cumulative record
• OSCE or OSPE methods
• Observational techniques
• Self report
• Anecdotal notes
STUDENT EVALUATION
TRENDS CONCERNING THE EVALUATION OF CLINICAL SKILLS
Greater emphasis on relating objectives to evaluation.
More focus on student learning as the function of clinical evaluation.
Increased attention to the clinical evaluation process as a vehicle for instructional improvement.
More involvement of the students in clinical evaluation.
Provision for observer training to improve reliability among faculty evaluators.
Increased use of simulation techniques.
Use of patient records as an instrument for clinical evaluation.
Renewed efforts towards dealing effectively with the issue of grades and clinical evaluation.
Combined clinical evaluation methods for more comprehensive evaluation.
CHARACTERISTICS OF CONSTRUCTIVE FEEDBACK
Feedback should be descriptive rather than evaluative.
Specific rather than general.
Focused on behavior rather than on personality.
Feedback involves sharing of information rather than giving advice.
Well timed.
The amount of information is limited to what the recipient can use.
Feedback should be solicited rather than imposed.
Feedback can be verified or checked to determine the degree of agreement from others.
Avoid collusion.
FACILITATING BEHAVIOR OF AN INSTRUCTOR
Positive feedback
Honest feedback
Constructive criticism
Clearly defined expectations
INTERFERENCE BEHAVIOR
Insufficient feedback
Only negative feedback
Lack of clearly defined expectations
Late return of papers
PROBLEMS OF EVALUATION
Leniency vs. stringency errors
Halo error
Recency error
Subjectivity
Errors of central tendency
Personal prejudice
Mistakes
Inaccurate appraisal
HELPING STUDENTS TO LEARN FROM THE TEST
Motivates them to study the subject.
Constructive feedback from the teacher guides and corrects.
Seeing their progress helps students build self-efficacy.
HELPING YOURSELF LEARN FROM THE TEST
Helps to diagnose student weaknesses.
Reveals areas where teaching failed to achieve its purpose.
Identifies students' problems in understanding.
Helps you improve the test.
RETURNING TEST PAPERS
Discussing the test is a worthwhile use of time.
Help students to assess their own learning.
Discuss common errors and suggest strategies to avoid such problems.
Explain what answers were expected.
DEALING WITH AN AGGRIEVED STUDENT
Be calm.
Listen to the complaint.
If the grade is to be changed, do it.
If not, help the student to find alternative models of study.
Ask the student to write the complaint in a paragraph so that it can be considered.
What do you do about the student who missed the test?
Take the average of only the tests she has completed.
ITEM ANALYSIS
PURPOSES
Item analysis tells us:
1. How easy the item is – how many students answered it correctly.
2. Whether the item measures the same thing as the rest of the test.
3. Whether it discriminates between the good and the weak students (see the sketch below).
4. It gives feedback information to improve items for future reuse.
5. It helps to eliminate defective items.
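A minimal sketch, using hypothetical response data that are not from the presentation, of two statistics commonly used for purposes 1 and 3: the difficulty index (proportion of students answering the item correctly) and the discrimination index (difference in proportion correct between the upper- and lower-scoring groups).

```python
# Minimal sketch of basic item analysis: difficulty and discrimination indices.
# 1 = correct, 0 = wrong; one list per student for a 4-item test (hypothetical data).
responses = [
    [1, 1, 1, 0],  # strong student
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
    [1, 0, 0, 0],  # weak student
]

n_items = len(responses[0])

# Rank students by total score, then split into upper and lower halves.
ranked = sorted(responses, key=sum, reverse=True)
half = len(ranked) // 2
upper, lower = ranked[:half], ranked[-half:]

for item in range(n_items):
    # Difficulty index: proportion of all students who got the item right.
    p = sum(s[item] for s in responses) / len(responses)
    # Discrimination index: upper-group minus lower-group proportion correct.
    d = (sum(s[item] for s in upper) - sum(s[item] for s in lower)) / half
    print(f"Item {item + 1}: difficulty = {p:.2f}, discrimination = {d:.2f}")
```

Items that turn out to be too easy, too hard, or poorly discriminating are the candidates for revision or elimination mentioned in purposes 4 and 5.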
NOTICE TO ALL TEACHERS
You are reminded that evaluation of education must begin with a clear and meaningful definition of its objectives.
"We don’t care how hard the students tried, we don’t care how close she got… Until she can perform she must not be certified as being able to perform." – R.F. Mager
THANK YOU