Program evaluation is the process of making
judgments about the extent to which a particular
educational program has achieved its objectives,
and of measuring how effectively and efficiently
the delivered program fulfills the purpose for
which it was developed.
• To measure the progress of a program
• To identify any problems and resolve
any conflicts
• To enhance utilization of available
resources
• To provide baseline information for
future evaluation and planning
• To modify the program and take any
remedial measures
• Philosophy and objectives of the college
• Admission criteria
• Staff welfare activities
• Faculty position
• Curriculum
• Student's performance
• Infrastructural facilities
• Records and their maintenance
 Consistency with the objectives
 Comprehensiveness
 Sufficient diagnostic value
 Validity
 Unity of evaluation judgment
 Continuity
 1) Context Evaluation
 It provides information about the
environment in which the evaluation is being
carried out.
2) Input Evaluation
 This is the next stage of evaluation. Input
evaluation mainly evaluates the men, money,
materials and policies needed to implement the goals.
3) Process Evaluation
 Process evaluation monitors the strategy adopted
to implement the program, obtains constant
feedback, and identifies defects in the program.
4) Product and Outcome Evaluation
 Finally, product evaluation looks into the
final product, i.e. whether the curriculum has
accomplished the desired objectives.
 Product evaluation provides data to match with
the mission and goal of the program.
3) Eisner’s Connoisseurship Model
This is one of the qualitative models of
evaluation. Unlike quantitative models such as
the Tylerian and CIPP models, this model does
not collect numerical data. The evaluation mainly
narrates, describes and makes a thorough
portrayal of the event or situation.
 Evaluation, an integral part of an educational
program, is the process of judging the
effectiveness of educational experience
through careful appraisal.
 According to Ralph Tyler (1950), Evaluation is
the process of determining to what extent the
educational objectives are being realized.
 Measurement : A quantitative process involving
the assigning of a number to an individual's
characteristics.
 Assessment : A term used instead of
measurement when a numerical value is not
involved, e.g. checklists of behaviors.
 Evaluation : The process of judging the value or
worth of an individual's characteristics obtained
by measurement or assessment.
 Test : An instrument or tool for obtaining
measurements or assessments, e.g. an essay.
 Examination : A formal situation in which
students undertake one or more tests under
specific rules.
 Evaluation, in a general sense, is a process of
assessment.
 Evaluation is a continuous process.
 It is a systematic process. Evaluation differs
from measurement.
 It is an integral part of education.
 Evaluation takes its direction from a definition of
education which, stated in its broadest
sense, is to enable students to realize their
potential as human beings.
 Evaluation is a means to an end and never an
end in itself
 Evaluation, by definition, connotes value.
 Evaluation involves teacher judgment.
 Validity and reliability are of paramount concern
in any evaluation activity.
 Evaluation is an integral part of the teaching-
learning process
 Every evaluation should be made with reference
to specified outcomes.
 Evaluation procedures should take into
consideration individual differences among
students.
 Evaluation of students involves more than a
single appraisal at any one time.
 The process of evaluation begins with the
outcomes of the educational program.
 Formative evaluation is done during an
instructional program.
 The instructional program should aim at the
attainment of certain objectives during the
implementation of the program.
 It is done to monitor learning and to modify
the program if needed before its completion.
 Formative evaluation is for current students.
 Summative evaluation is done at the end or
completion of a particular instructional
program, whose duration may vary from a
semester to a whole year, to determine whether
the student is competent enough for certification.
 Since the result is usually in the form of a
single total score or a pass or fail, there is
little scope for meaningful feedback either to
the learner or to the teacher.
 The terms criterion-referenced and norm-
referenced were originally coined by Robert
Glaser.
 A criterion-referenced test is one that provides
for translating test scores into a statement
about the behavior to be expected of a person
with that score.
 For example, the criterion may be "Students should
be able to correctly add two single-digit
numbers", and the cut score may be that
students must correctly answer a minimum of
80% of the questions to pass.
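As a rough illustration, the cut-score rule above can be sketched in a few lines of Python (the function name and figures are illustrative, not from any standard):

```python
def criterion_referenced_result(correct: int, total: int, cut_score: float = 0.80) -> str:
    """Judge pass/fail against a fixed criterion, independent of how peers score."""
    return "pass" if correct / total >= cut_score else "fail"

# 17 of 20 single-digit addition items is 85%, which meets the 80% cut score.
high = criterion_referenced_result(17, 20)   # "pass"
# 15 of 20 is 75%, which falls below the cut score.
low = criterion_referenced_result(15, 20)    # "fail"
```

The key point is that the decision depends only on the fixed criterion, never on how other students performed.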
 A norm-referenced test / NRT is a type of
test, assessment, or evaluation which yields
an estimate of the position of the tested
individual in a predefined population, with
respect to the trait being measured
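The idea of locating an individual within a predefined population can be sketched as a simple percentile-rank computation (the cohort scores below are invented for illustration):

```python
def percentile_rank(score: float, population_scores: list[float]) -> float:
    """Percentage of the reference population scoring below the tested individual."""
    below = sum(1 for s in population_scores if s < score)
    return 100.0 * below / len(population_scores)

cohort = [42, 55, 61, 68, 70, 74, 79, 83, 88, 95]
rank = percentile_rank(79, cohort)  # 6 of 10 scores are lower: the 60th percentile
```

Unlike a criterion-referenced cut score, the same raw score can earn a different rank in a different cohort.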
Evaluation by the teacher through teacher-made and
standardized tests, observation and other
techniques is known as internal assessment.
6) External Examination
External examination or assessment is organized
and conducted by an external agency other than
the college. The question formats described below
give insight into conducting an external
examination.
1) Essay Examinations
 The essay-type examination seeks to measure
the integrated knowledge of the examinees.
 Essay questions are supply or constructed-
response type questions and can be the best
way to measure the student’s thinking skills
such as:
 applying,
 organizing,
 synthesizing,
 integrating,
 evaluating, or projecting.
 Short answer questions are short and direct,
and students are expected to answer with a
word, a phrase or a numerical response to the
question.
Types of Short Answer Questions
 Asking for a definition.
 Asking to draw a diagram.
 Asking to complete an incomplete sentence.
 Asking for a unique answer to a direct
question.
 The objective tests seek to measure more
consistently and accurately the knowledge of
terms, concepts, vocabulary, facts and
understanding, and measure only what is
intended to be measured.
Types of Objective Type Test Item
 Multiple choice items / select the best answer
 True or false items
 Matching type items
 Sentence completion items
Validation and Banking of MCQs
Item analysis is the process of analyzing the
performance of a multiple choice item after it
has appeared in a question paper.
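Item analysis is conventionally summarized with two statistics, a difficulty index and a discrimination index; a minimal sketch follows (the function names and sample counts are illustrative):

```python
def difficulty_index(correct: int, attempted: int) -> float:
    """Proportion of students answering the item correctly (higher = easier)."""
    return correct / attempted

def discrimination_index(upper_correct: int, lower_correct: int, group_size: int) -> float:
    """D = (U - L) / n: how well the item separates high scorers (upper group)
    from low scorers (lower group) of equal size n."""
    return (upper_correct - lower_correct) / group_size

p = difficulty_index(60, 100)          # 0.6: a moderately easy item
d = discrimination_index(24, 10, 27)   # about 0.52: the item discriminates well
```

Items with very extreme difficulty or poor discrimination are revised or rejected before the question is banked for reuse.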
B) TRUE OR FALSE ITEMS
These are questions or statements followed by
Yes/No or True/False responses. The student
is asked to tick or mark the correct response.
 Include only a single concept or idea per item.
 Write clear and direct statements.
 Avoid using clues.
 Avoid tricky and catchy items.
 Have an equal number of true and false items.
 A detectable pattern of answers (T, F, T, F)
should be avoided.
 The statement should not be taken directly
from the text-book.
 The direction for answering the question
must be clear.
The matching type items are prepared in two
columns: one called the stimulus column and the
other called the response column.
Principles for Preparation of Matching Items
 The statements should belong to the same category.
 The number of choices should be more than the
required answers.
 Too many items may be confusing and distracting.
 The stimuli and response columns should be on the
same page.
 The terminology in one column should not give clues
to the expected responses in the other column.
 Provide clear directions for matching.
 Arrange the responses, or both columns, in alphabetical order.
Observation can be defined as the direct
visualization of the performance of a task or
behavior. It involves watching students carry
out some activity, or listening to pupils
speaking, reading and discussing things.
 Anecdotal records can be defined as a brief description
of an observed behavior that appears significant for
evaluation purposes.
Characteristics of Anecdotal Records
 • It is a factual recording only of the actual event,
incident or observation.
 • It is a record of only one incident.
 • It is a record of an incident which is considered
important and significant in the growth and
development of the student.
 • It should include:
- a description of the particular occasion
- a delineation of the behavior noted, indicating who,
what, why, when, where and how
- the evaluator's opinion or estimate of the incident or
behavior.
Checklists
Checklists are lists of items or performance
indicators requiring dichotomous responses
such as satisfactory or unsatisfactory, pass or fail,
yes or no, present or absent, etc.
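A checklist therefore reduces to a set of yes/no indicators; a tiny sketch (the indicator wording is hypothetical, not taken from any prescribed nursing checklist):

```python
# Hypothetical skill checklist with dichotomous (observed / not observed) responses.
checklist = {
    "Washes hands before the procedure": True,
    "Verifies patient identity": True,
    "Maintains a sterile field": False,
    "Documents the procedure afterwards": True,
}
completed = sum(checklist.values())  # True counts as 1, so this tallies 3 of 4 indicators
```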
Rating Scale
A rating scale can be defined as "a
standardized method of recording and
interpreting behavior, which is totally
based on observation, strictly in line with the
educational objectives".
 Various types of rating scales that are
commonly used are:
 • Descriptive rating scales
 • Graphic rating scales
 • Numerical rating scales
 • Behaviorally anchored rating scales.
 The practical exams are meant to assess the
professional competence gained by the
students over a period of time and whether it
meets the requirements and expectations
specified by the statutory boards.
Knowledge is tested in the theory examination
and skills are evaluated in the clinical/practical
examination, while the oral examination is meant
to evaluate the following qualities:
 Depth of knowledge
 Ability to discuss and defend one's decisions
 Attitudes
 Alertness
 Ability to perform under stress
 Professional competence
 Objective structured practical examination
(OSPE) is a new pattern of practical
examination, in which each component of
clinical competence is tested uniformly and
objectively for all the students who are taking
up a practical examination at a given place.
 OSCE includes a series of 12 to 20 stations,
each testing one or two components of
clinical competency for 3 to 5 minutes.
Students rotate through all the stations at a
predetermined time interval; a series of
12 to 20 stations can therefore accommodate
12 to 20 students, who are examined
simultaneously.
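The rotation described above can be sketched as a simple round-robin schedule, assuming one student per station per interval (the scheduling scheme is an illustration, not a prescribed OSCE procedure):

```python
def osce_rotation(n_stations: int) -> list[list[int]]:
    """schedule[interval][student] = station visited by that student in that interval.
    With as many students as stations, every station is occupied in every interval,
    and every student visits every station exactly once."""
    return [[(student + interval) % n_stations for student in range(n_stations)]
            for interval in range(n_stations)]

schedule = osce_rotation(12)  # 12 stations, 12 students, 12 timed intervals
```

At each timed interval every student simply moves to the next station, which is exactly the modular shift the schedule encodes.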
 An attitude scale measures
how the participant feels about a subject at
the moment when he or she answers the
question. Several popular types of attitude
scales are used in evaluation of nursing
education.
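A Likert-type attitude scale, one of the popular formats, is typically scored by summing the item responses, with negatively worded items reverse-keyed; a minimal sketch (the item count and responses are invented):

```python
def likert_score(responses: list[int], reverse_keyed: set[int], points: int = 5) -> int:
    """Sum a Likert-type scale; reverse-keyed (negatively worded) items are flipped
    so that a higher total always means a more favourable attitude."""
    return sum((points + 1 - r) if i in reverse_keyed else r
               for i, r in enumerate(responses))

# Five items scored 1-5; the item at index 2 is negatively worded.
total = likert_score([4, 5, 2, 3, 4], reverse_keyed={2})  # 4 + 5 + (6-2) + 3 + 4 = 20
```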

Program evaluation
