Evaluation strategies conference final

  • 399 views
Published in Education
Transcript

  • 1. Dr. Mrs. Mary Sulakshini Immanuel
  • 2. CONTENTS
  • 3. • Objectives • Five key questions • Definition • Place of evaluation in the educational cycle • Historical overview
  • 4. • Purpose of Evaluation • Role of Evaluation in educational cycle • Student Evaluation – What for • Characteristics of evaluation • Process of Evaluation
  • 5. • Common defects of examinations • Selecting evaluation instruments • Test Blue Print Design • Evaluation Tools – Methods • Trends concerning the Evaluation of clinical skills
  • 6. • Characteristics of constructive feedback • Facilitating behavior of an instructor • Interference behavior • Problems of evaluation • Helping students to learn from the test
  • 7. • Helping yourself learn from the test • Returning test papers • Dealing with an aggrieved student • What do you do with a student who missed the test • Notice to all teachers
  • 8. OBJECTIVES
  • 9. • Update knowledge regarding various evaluation strategies • Understand different tools and techniques used in evaluation • Identify different evaluation strategies and their application in different settings • Discuss basic item analysis and sociometry
  • 10. FIVE KEY QUESTIONS •What •Why •How •When •Who
  • 11. DEFINITIONS
  • 12. E – Value. Evaluation is a systematic process of determining the extent to which instructional objectives are achieved; it involves qualitative description of the learner's behavior and a value judgment concerning the desirability of that behavior.
  • 13. Evaluation is a process of determining the extent to which instructional objectives are achieved by the learner.
  • 14. PLACE OF EVALUATION IN EDUCATIONAL CYCLE
  • 15. DEFINITION OF OBJECTIVES PROGRAM IMPLEMENTATION EVALUATION
  • 16. HISTORICAL OVERVIEW
  • 17. 400 BC: Examinations and statistics in use. 1873: Nurse report cards; concern over evaluation. 1921: Efficiency reports; students detained on 'whims'. Clinical evaluation has a long and tortured history. 1930: Checklists – personality assessment.
  • 18. 1940: World War II – checklists fall into disuse. 1959: Self-evaluation. 1963: Critical incident report. 1970: Behavior checklist. 1980: Competency-based evaluation. 1992: Formative and summative evaluation.
  • 19. WHY? THE PURPOSE OF EVALUATION
  • 20. • Diagnosis • Prediction • Grading • Selection • Guidance • Exploration • Evaluation of teaching • Motivation
  • 21. ROLE OF EVALUATION IN TEACHING
  • 22. 1. Contributes directly to teaching 2. Pre-assesses the learner's needs 3. Provides relevant teaching 4. Evaluates the intended outcome 5. Tracks the progress of the student 6. Improves future learning experiences
  • 23. STUDENT EVALUATION - WHAT FOR
  • 24. 1.Incentive to learn, motivation 2.Feedback to students 3.Modification of learning activities 4.Selection of students 5.Success or failure 6.Feedback to teacher 7.School – Public relations 8.Protection of society
  • 25. CHARACTERISTICS OF EVALUATION
  • 26. 1. Appropriateness 2. Effectiveness 3. Practicability 4. Ease of administering 5. Ease of scoring 6. Ease of interpretation
  • 27. 7. Continuity 8. Objectivity 9. Relevance 10. Test usefulness 11. Precise and clear
  • 28. 12. Adequacy and balance 13. Utility 14. Comparability 15. Validity 16. Reliability
  • 29. 17. Equilibrium 18. Equity 19. Specificity 20. Time 21. Length
  • 30. PROCESS OF EVALUATION
  • 31. • Identify the purpose of Evaluation • Identify the time frame • Determining when to Evaluate • Select the Evaluators • Choose an Evaluation design/ model
  • 32. • Select the Evaluation instruments • Collect data • Interpret data • Report the findings • Consider the cost of evaluation
  • 33. COMMON DEFECTS OF EXAMINATIONS
  • 34. • Triviality • Outright error • Ambiguity • Obsolescence • Bias • Complexity • Unintended clues • Conservatism
  • 35. SELECTING EVALUATION INSTRUMENTS
  • 36. • Measurement purpose • Inference level • Validity + Reliability • Feasibility • Effect on students
  • 37. TEST BLUEPRINT – DESIGNING. Consider: • Objectives of the course, major concepts • The duration of teaching • Level of learning outcome • Weightage and points
  • 38. TEST CONSTRUCTION SPECIFICATION TABLE
  • 39. E.g. Fundamentals of Nursing, Units 4, 5 and 6 (marks per cognitive level):

    Content | Knowledge | Comprehension | Application | Analysis | Synthesis | Evaluation | Total
    Unit 4  |    10     |      12       |     10      |    4     |     4     |     10     |  50
    Unit 5  |     5     |       6       |      5      |    2     |     2     |      5     |  25
    Unit 6  |     5     |       6       |      5      |    2     |     2     |      5     |  25
    Total   |    20     |      24       |     20      |    8     |     8     |     20     | 100
  • 40. Decide how many questions to set, what weightage to give each, and how much time to allow per question in a one-hour paper. The teacher should take the test herself and allow the students three times as much time.
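The row and column totals of a specification table like the one above can be checked programmatically. A minimal sketch in Python, using the mark allocations from the example; the data structure and function names are illustrative assumptions, not part of the original slides:

```python
# Test blueprint (specification table) sketch: marks per content unit
# and cognitive level, with row/column totals for cross-checking.
LEVELS = ["Knowledge", "Comprehension", "Application",
          "Analysis", "Synthesis", "Evaluation"]

blueprint = {
    "Unit 4": [10, 12, 10, 4, 4, 10],
    "Unit 5": [5, 6, 5, 2, 2, 5],
    "Unit 6": [5, 6, 5, 2, 2, 5],
}

def unit_totals(bp):
    """Total marks allotted to each content unit (row totals)."""
    return {unit: sum(marks) for unit, marks in bp.items()}

def level_totals(bp):
    """Total marks allotted to each cognitive level (column totals)."""
    columns = zip(*bp.values())
    return dict(zip(LEVELS, (sum(col) for col in columns)))

print(unit_totals(blueprint))   # Unit 4: 50, Unit 5: 25, Unit 6: 25
print(level_totals(blueprint))  # Knowledge: 20, Comprehension: 24, ...
```

Summing either the row totals or the column totals should give the full paper's marks (here 100), which is a quick way to catch allocation mistakes before the paper is set.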
  • 41. EVALUATION TOOLS / METHODS OF EVALUATION. Categorized into: knowledge, attitude, skills
  • 42. COGNITIVE/KNOWLEDGE. Subjective tests: • Essay type • Descriptive or narrative type. Objective type: • Multiple choice question • Single choice question • Matching • True–False • Fill in the blanks
  • 43. COGNITIVE/KNOWLEDGE. Problem-solving type: • Assignment • Anecdotal records • Achievement tests • Teacher-made tests • Standardized tests • Simulation
  • 44. AFFECTIVE/ATTITUDES: • Interview • Anecdotal records • Attitude scales • Rating scales • Cumulative record • Sociometry
  • 45. AFFECTIVE/ATTITUDES: • Projective technique • Aptitude tests • Observational technique • Group discussion • Interaction reports • Log book
  • 46. AFFECTIVE/ATTITUDES: • Video taping • Care plans • Debate • Position paper • Critical incident • Seminar presentation • Critique presentation
  • 47. PSYCHOMOTOR/SKILLS: • Performance appraisal • Critical incident technique • Cumulative record • OSCE or OSPE methods • Observational technique • Self report • Anecdotal notes
  • 48. STUDENT EVALUATION
  • 49. TRENDS CONCERNING THE EVALUATION OF CLINICAL SKILLS
  • 50. Greater emphasis on relating objectives to evaluation. More focus on student learning as the function of clinical evaluation.
  • 51. Increased attention to the clinical evaluation process as a vehicle for instructional improvement More involvement of the students in clinical evaluation
  • 52. Provision of observer training to improve reliability among faculty evaluators. Increased use of simulation techniques. Use of patient records as instruments for clinical evaluation.
  • 53. Renewed efforts towards dealing effectively with the issue of grades and clinical evaluation Combined clinical evaluation methods for more comprehensive evaluation
  • 54. CHARACTERISTICS OF CONSTRUCTIVE FEEDBACK
  • 55. Feedback should be descriptive rather than evaluative; specific rather than general; focused on behavior rather than on personality. Feedback involves sharing information rather than giving advice. It should be well timed.
  • 56. The amount of information is limited to what the recipient can use. Feedback should be solicited rather than imposed. Feedback can be verified or checked to determine the degree of agreement from others. Avoid collusion.
  • 57. FACILITATING BEHAVIOR OF AN INSTRUCTOR
  • 58. Positive feedback Honest feedback Constructive criticism Clearly defined expectations
  • 59. INTERFERENCE BEHAVIOR: Insufficient feedback • Only negative feedback • Lack of clearly defined expectations • Late returning of papers
  • 60. PROBLEMS OF EVALUATION
  • 61. Leniency vs. stringency errors • Halo error • Recency error • Subjectivity • Errors of central tendency • Personal prejudice • Mistakes • Inaccurate appraisal
  • 62. HELPING STUDENTS TO LEARN FROM THE TEST
  • 63. Tests make students study the subject. Constructive feedback from the teacher guides and corrects. Seeing their own progress helps students build self-efficacy.
  • 64. HELPING YOURSELF LEARN FROM THE TEST
  • 65. Helps to diagnose student weaknesses. Reveals areas where teaching failed to achieve its purposes. Identifies students' problems in understanding. Improves the test.
  • 66. RETURNING TEST PAPERS. Discussing the test is a worthwhile use of time. Help students to assess their own learning. Discuss common errors and suggest strategies to avoid such problems. Explain what answers are expected.
  • 67. DEALING WITH AN AGGRIEVED STUDENT
  • 68. Be calm. Listen to the complaints. If the grade should be changed, change it. If not, help the student find alternative modes of study. Ask the student to write the complaint in a paragraph to be considered.
  • 69. What do you do about the student who missed the test? Simply take the average of the tests she has completed.
  • 70. ITEM ANALYSIS
  • 71. PURPOSES. Item analysis tells us: 1. How easy the item is – how many students answered it correctly. 2. Whether the item measures the same thing as the rest of the test. 3. Whether it discriminates between strong and weak students. 4. It gives feedback to improve items for future reuse. 5. It helps to eliminate defective items.
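The first and third purposes above correspond to two standard item statistics: the difficulty index (proportion of correct answers) and the upper/lower-group discrimination index. A minimal sketch in Python; the function names, sample data, and the conventional 27% grouping are illustrative assumptions, not taken from the slides:

```python
# Basic item analysis: difficulty index (p) and discrimination index (D).
def difficulty_index(responses):
    """p = proportion of students answering the item correctly.

    `responses` is a list of 1 (correct) / 0 (incorrect).
    A higher p means an easier item."""
    return sum(responses) / len(responses)

def discrimination_index(scores_and_items, k=0.27):
    """D = (correct in upper group - correct in lower group) / group size.

    `scores_and_items` is a list of (total_test_score, item_response)
    pairs; the upper and lower groups are the top and bottom fraction k
    (conventionally 27%) of students ranked by total score. A positive D
    means strong students got the item right more often than weak ones."""
    ranked = sorted(scores_and_items, key=lambda t: t[0], reverse=True)
    n = max(1, int(len(ranked) * k))
    upper = sum(item for _, item in ranked[:n])
    lower = sum(item for _, item in ranked[-n:])
    return (upper - lower) / n

# Ten students: (total test score, response to one item under analysis)
data = [(95, 1), (90, 1), (85, 1), (80, 1), (70, 1),
        (65, 0), (60, 1), (50, 0), (40, 0), (30, 0)]
p = difficulty_index([item for _, item in data])  # 0.6 -> moderate difficulty
d = discrimination_index(data)                    # 1.0 -> discriminates well
```

An item with p near 0 or 1 tells us little, and an item with a negative D (weak students outperforming strong ones) is a candidate for the "defective items" the slide says should be eliminated.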
  • 72. NOTICE TO ALL TEACHERS You are reminded that evaluation of education must begin with a clear and meaningful definition of its objectives
  • 73. We don’t care how hard the students tried, we don’t care how close she got… Until she can perform she must not be certified as being able to perform. R.F. Mager
  • 74. THANK YOU