4. MARKS
A mark is a number or letter indicating quality, especially of a student's work.
5. ITEM (In Education)
Tests are tools that are frequently used to facilitate
the evaluation process.
ANALYSIS
Analysis is a process which enters into research in one form or another from the very beginning.
6. • Scoring: to evaluate a response and assign a grade.
• Report: a document containing information organized in narrative, graphic, or tabular form, prepared on a periodic, recurring, regular, or as-required basis.
7. ADMINISTERING A TEST:
A teacher's test administration procedures can have a great impact on student test performance.
• Before the test
• After distributing the test papers
• During the test
• After the test
8. THE SCORING METHOD:
The points to be credited should be identified by a person with expert knowledge of the topic in question; the expert must also supply documentation and commentary on the item.
9. Six-point graded scale:
0 - point is not mentioned in the paper
1 - point is mentioned but is incorrectly stated
2 - point is mentioned but is only partly correctly stated
3 - point is mentioned and is fully correctly stated
4 - point is correctly stated and is partly elaborated
5 - point is correctly stated and is fully elaborated.
10. TYPES OF SCORE
Raw Scores:
A raw score is simply the number of questions a student answers correctly on a test.
11. Uses:
• A raw score provides an indication of the variability in performance among students in a classroom.
12. Limitations:
• A raw score by itself has no meaning. It can be interpreted only by comparing it with some standard, such as the total number of items on a test, or with the raw scores earned by a comparison group.
13. Percentile Rank
• A percentile is a measure that tells us what percent of the total frequency scored at or below that measure. A percentile rank is the percentage of scores that fall at or below a given score.
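The definition above can be sketched directly. A minimal example, with hypothetical scores (note that some textbooks count only scores strictly below the given score; the "at or below" convention from the slide is used here):

```python
def percentile_rank(scores, score):
    """Percentage of scores that fall at or below the given score."""
    at_or_below = sum(1 for s in scores if s <= score)
    return 100.0 * at_or_below / len(scores)

scores = [55, 60, 62, 65, 70, 72, 75, 80, 85, 90]
print(percentile_rank(scores, 72))  # 6 of 10 scores lie at or below 72 -> 60.0
```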
16. Stanine (Standard Nine)
• Stanine scores express test results in equal steps that range from 1 (lowest) to 9 (highest). The average is a score of 5. In general, stanine scores 1, 2 and 3 are below average, 4, 5 and 6 are average, and 7, 8 and 9 are above average.
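One common way to derive a stanine is from a z-score via round(2z + 5), clipped to the 1-9 range; this is a sketch of that conversion (an assumption on my part, since the slide does not state the computation):

```python
def stanine(z):
    """Convert a z-score to a stanine: round 2z + 5, then clip to the 1..9 range."""
    return max(1, min(9, round(2 * z + 5)))

print(stanine(0.0))   # a score at the mean -> 5 (average)
print(stanine(1.8))   # well above the mean -> 9
print(stanine(-2.5))  # far below the mean -> 1
```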
17. Standard Scores
Types:
Z-Score
• If a mean and standard deviation can be calculated for a given set of raw scores, each raw score can be expressed in terms of its distance from the mean in standard-deviation units, or z-scores.
18. Z-Score = (Raw Score - Mean) / Standard Deviation
Note: the z-score is always negative when the raw score is smaller than the mean.
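The z-score formula can be illustrated with a small set of hypothetical raw scores (the population standard deviation is used here; a sample standard deviation would give slightly different values):

```python
from statistics import mean, pstdev

def z_score(raw, scores):
    """z = (raw score - mean) / standard deviation."""
    return (raw - mean(scores)) / pstdev(scores)

scores = [40, 50, 60, 70, 80]   # mean 60, population SD ~14.14
print(round(z_score(80, scores), 2))  # ~1.41 SD above the mean
print(z_score(50, scores) < 0)        # True: a raw score below the mean gives a negative z
```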
19. T-Scores:
• Any set of normally distributed standard scores that has a mean of 50 and an SD of 10. A T-score is obtained by multiplying the z-score by 10 and adding the product to 50.
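The T = 10z + 50 transformation described above is a one-liner:

```python
def t_score(z):
    """T = 10z + 50: rescales z-scores to mean 50 and SD 10."""
    return 10 * z + 50

print(t_score(0.0))   # a raw score at the mean -> 50.0
print(t_score(-1.5))  # 1.5 SD below the mean -> 35.0
```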
21. GRADING
Definition and History of Grading:
• Grading refers to the process of using symbols, such as letters, to indicate various types of student progress (Nitko, 2001).
22. What is Grading
In a grading system, students are classified into a few ability groups or categories according to their level of achievement in an examination.
24. GRADING SYSTEM
In a grading system, the student is evaluated on a five- or seven-point scale pattern and given a grade, e.g., O, A+, A, A-, B+, B, B-, etc.
The grade is given according to criteria decided by the faculty of the school, such as 90 to 100 'O' (Outstanding), 81 to 90 'A+' (Very Good), 71 to 80 'A' (Good), and so on.
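The example scale above can be sketched as a simple lookup. The cut-offs are taken from the slide and are illustrative only, since each faculty sets its own criteria:

```python
# Cut-offs follow the slide's illustrative scale; each faculty sets its own criteria.
GRADE_BANDS = [(90, "O"), (81, "A+"), (71, "A")]  # extend with B+, B, B- as needed

def letter_grade(percent):
    """Return the first grade whose lower cut-off the percentage reaches."""
    for cutoff, grade in GRADE_BANDS:
        if percent >= cutoff:
            return grade
    return "below A"  # placeholder for the remaining bands

print(letter_grade(95))  # O  (Outstanding)
print(letter_grade(85))  # A+ (Very Good)
print(letter_grade(73))  # A  (Good)
```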
25. Common Methods of Grading:
• Letter grades:
There is great flexibility in the number of grades that can be adopted.
28. • Number/Percentage grades (5, 3, 2, 1, 0) or (98%, 80%, 60%, etc.)
This is the same as letter grades; the only difference is that numbers or percentages are used instead of letters.
29. Strengths:
• Easy to use
• Easy to interpret theoretically
• Provide a concise summary
• May be combined with letter grades
• More continuous than letter grades
32. SCORING ESSAY TESTS:
One way to improve objectivity and relevancy in scoring essay tests is to prepare an ideal answer to each essay question and to base the scoring on the relation between each examinee's answer and the ideal answer.
33. Types of Scoring for Essay Tests
There are two types of scoring standards in subjective tests:
• Norm-referenced
• Criterion-referenced
34. Methods of Scoring Essay Tests
There are two methods of scoring essay tests:
• Analytic method
• Holistic method
37. Point or Analytic Method
In this method, each answer is compared with an already prepared ideal marking scheme (scoring key), and marks are assigned according to the adequacy of the answer.
40. Procedure for Scoring Essay Questions
• Use the scoring method that is most appropriate for the test item.
• Decide how to handle factors that are irrelevant to the learning outcomes, such as handwriting, spelling, and punctuation.
• Prepare the ideal answer, or an outline of the expected answer, immediately after constructing the test.
• Make comments during the scoring of each essay item. These comments act as feedback to students.
41. Suggestions for Scoring Essay-Type Questions
• Identify the method of scoring to be used prior to the testing situation and inform the students of it
• Avoid fatigue
• The identity of the examinees should be kept secret from the examiner
42. • Read a random sample
• Use the same scoring system for all papers
• Cover the scores of the previous answers
44. ITEM ANALYSIS
Item analysis is a statistical technique used for selecting and rejecting the items of a test on the basis of their difficulty value and discriminating power.
45. OBJECTIVES OF ITEM ANALYSIS
• To select appropriate items for the final draft of the test
• To obtain information about the difficulty value (D.V) of the items
• To determine the discriminating power (D.I) of the items, i.e., their ability to differentiate between capable and less capable examinees
• To identify modifications to be made in some of the items
• To prepare the final draft properly (easy to difficult items)
46. STEPS OF ITEM ANALYSIS
• Arrange the scores in descending order
• Separate the test papers into two subgroups: take the 27% of papers with the highest scores and the 27% with the lowest scores
• Count the number of right answers in the highest group (R.H) and the number of right answers in the lowest group (R.L)
• Count the non-responding (N.R) examinees
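The steps above can be sketched as follows; the 27% fraction comes from the slide, and the scores are made up for illustration:

```python
def split_groups(scores, fraction=0.27):
    """Sort the scores in descending order and return the top and bottom fraction."""
    ordered = sorted(scores, reverse=True)
    k = max(1, round(len(ordered) * fraction))
    return ordered[:k], ordered[-k:]

scores = [95, 88, 84, 80, 77, 75, 70, 66, 60, 52, 45, 40]
high, low = split_groups(scores)   # 27% of 12 papers -> 3 per group
print(high)  # [95, 88, 84]
print(low)   # [52, 45, 40]
```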
47. Item analysis is carried out to obtain:
a) Difficulty value (D.V)
b) Discriminative power (D.P)
48. DIFFICULTY VALUE (D.V)
"The difficulty value of an item is defined as the proportion or percentage of the examinees who have answered the item correctly."
- J.P. Guilford
49. The formula for difficulty value (D.V):
D.V = (R.H + R.L) / (N.H + N.L)
• R.H - number who rightly answered in the highest group
• R.L - number who rightly answered in the lowest group
• N.H - number of examinees in the highest group
• N.L - number of examinees in the lowest group
50. When non-responding examinees are present, the formula for difficulty value (D.V) becomes:
D.V = (R.H + R.L) / [(N.H + N.L) - N.R]
• R.H - number who rightly answered in the highest group
• R.L - number who rightly answered in the lowest group
• N.H - number of examinees in the highest group
• N.L - number of examinees in the lowest group
• N.R - number of non-responding examinees
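Both versions of the formula can be combined into one function, with N.R defaulting to zero when there are no non-responses. The group counts below are hypothetical:

```python
def difficulty_value(r_h, r_l, n_h, n_l, n_r=0):
    """D.V = (R.H + R.L) / [(N.H + N.L) - N.R]; N.R defaults to 0."""
    return (r_h + r_l) / ((n_h + n_l) - n_r)

# 27 examinees per group; 20 in the high group and 7 in the low group answered correctly
print(difficulty_value(20, 7, 27, 27))         # 27 / 54 = 0.5
print(difficulty_value(20, 7, 27, 27, n_r=4))  # 27 / 50 = 0.54
```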
51. General guidelines for difficulty value (D.V)
• A low difficulty value index means the item is a highly difficult one.
  ex: D.V = 0.20 → only 20% answered that item correctly, so the item is too difficult.
• A high difficulty value index means the item is an easy one.
  ex: D.V = 0.80 → 80% answered that item correctly, so the item is too easy.
53. DISCRIMINATION INDEX (D.I)
"The index of discrimination is that ability of an item on the basis of which the discrimination is made between superiors and inferiors."
- Blood and Budd (1972)
54. TYPES OF DISCRIMINATION INDEX (D.I)
• Zero discrimination or no discrimination
• Positive discrimination
• Negative discrimination
55. The formula for discrimination index (D.I):
D.I = (R.H - R.L) / (N.H or N.L)
• R.H - number who rightly answered in the highest group
• R.L - number who rightly answered in the lowest group
• N.H - number of examinees in the highest group
• N.L - number of examinees in the lowest group
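A minimal sketch of the formula, with hypothetical group counts:

```python
def discrimination_index(r_h, r_l, n):
    """D.I = (R.H - R.L) / N, where N is the size of one group (N.H or N.L)."""
    return (r_h - r_l) / n

print(round(discrimination_index(20, 7, 27), 2))  # (20 - 7) / 27 -> 0.48, a very good item
print(discrimination_index(10, 10, 27))           # 0.0 -> no discrimination
```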
56. Another method for the discrimination index (D.I):
• If you use a Likert scale for data collection, compute the correlation between each item's scores and the total scores (item-total correlation).
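The item-total correlation can be computed with a plain Pearson coefficient; the Likert scores below are made up for illustration:

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

item = [1, 2, 4, 4, 5]        # one Likert item across five examinees
total = [10, 14, 20, 22, 25]  # each examinee's total score
print(round(pearson_r(item, total), 2))  # close to 1: the item tracks the total well
```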
57. General guidelines for the discrimination index (D.I)
According to Ebel:
D.I          Item Evaluation
≥ 0.40       Very good items
0.30 - 0.39  Reasonably good, but subject to improvement
0.20 - 0.29  Marginal items, need improvement
< 0.19       Poor items, to be rejected or revised
58. Relationship between Difficulty Value and Discrimination Power
Item difficulty level and item discrimination level are closely related to each other. The range of values of the discrimination index depends on the difficulty level of the item.
Example: an item with a difficulty value of 0 or 1.00 always has a discrimination index of zero.
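A quick check with the two formulas confirms the example: at difficulty 0 or 1.00, every examinee in both groups answers identically, so the discrimination index collapses to zero (group sizes are hypothetical):

```python
def difficulty_value(r_h, r_l, n_h, n_l):
    return (r_h + r_l) / (n_h + n_l)

def discrimination_index(r_h, r_l, n):
    return (r_h - r_l) / n

# Difficulty 1.00: all 27 examinees in both groups answer correctly -> D.I is 0
print(difficulty_value(27, 27, 27, 27), discrimination_index(27, 27, 27))  # 1.0 0.0
# Difficulty 0: nobody answers correctly -> D.I is again 0
print(difficulty_value(0, 0, 27, 27), discrimination_index(0, 0, 27))      # 0.0 0.0
# Maximum discrimination is only possible at a middling difficulty value
print(difficulty_value(27, 0, 27, 27), discrimination_index(27, 0, 27))    # 0.5 1.0
```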
59. CRITERIA FOR SELECTING AND REJECTING ITEMS
• Only items with a positive discrimination index are selected
• Items with a negative or zero discrimination index are rejected
• Items with very high or very low difficulty values are rejected
60. REFERENCES
Sharma, R.A. (2007). Essentials of Scientific Behavioural Research. Meerut: Lall Book Depot.
Mehta, D.D. (2011). Educational Measurement and Evaluation. Meerut: Tandon Publications.
Sidhu, Kulbir Singh (2005). New Approaches to Measurement and Evaluation. New Delhi: Sterling Publishers.
Husain, Akbar (2012). Psychological Testing. New Delhi: Pearson.
62. INTRODUCTION
• When questions are framed with reference to the objectives of instruction, the test becomes objective-centred.
63. * What is an Objective Test
* Categories of Objective Tests
* Advantages and Disadvantages of Objective Tests
* Types of Objective Tests
64. DEFINITION
Objective test items are items that can be objectively scored: items on which a person selects a response from a list of options.
- W. Wiersma & S. G. Jurs
67. Constructed Response Format
- Supply type
- Enumeration
- Labeling
- Identification
- Completion type
- Simple Recall
68. Types of Objective Test
- True or False
- Matching Type
- Multiple Choice
- Enumeration
- Labeling
- Identification
- Completion type
- Simple Recall
69. • 1. Short answer item
• Short-answer items and completion items are supply-type items that can be answered by a word, phrase, number, or symbol. They are essentially the same, differing only in the method of presentation.
Examples:
• Short answer: What is the name of the man who discovered penicillin? (Alexander Fleming)
• Completion type: The name of the person who discovered penicillin is ______. (Alexander Fleming)
70. Suggestions:
• Word the item so that the required answer is both brief and specific
• Do not take statements directly from the textbook
• Blanks for answers should be equal in length
• Blanks should be in a column to the right of the question
• Do not include too many blanks.
71. True or False Test Type
- An objective-type test presented in the form of a simple declarative statement, to which the pupils respond by indicating whether the statement is true or false. It is applicable to all learning areas.
72. Suggestions:
• Avoid broad general statements
• Avoid trivial statements
• Avoid using negative statements (especially double negatives)
• Avoid long, complex sentences
• Avoid including two ideas in one statement, except when measuring cause-effect relationships
• If an opinion is used, attribute it to some source
• True and false statements should be approximately equal in length
• The number of true and false statements should be equal.
73. Types of True or False Test
⢠Simple True or False
⢠Modified True or False
⢠True or False with correction
⢠Cluster True or False
⢠True or False with options.
⢠Fact or Opinion
74. How to construct the true or false type test?
• Do not copy statements from the book.
• As a general practice, vary the proportion of true and false statements.
• Express your statements in as simple a language as possible.
• Keep your sentences reasonably short, or restrict them to one central idea.
75. How to construct the true or false type test?
• If you wish to score the papers right minus wrong (R - W), state so in the directions.
• Be very careful about the grammatical structure of the sentences.
76. Example:
Direction: Write T if the statement is true and F if the statement is false. Write your answer on the space provided before the number.
1. The serial number of the last score in ranking is the same as the number of cases.
2. Identify is the behavioral term of comprehension.
3. Synthesis proposes a plan for combining material from different areas into a plan for solving problems.
4. A criterion-referenced test is a measure which compares a student's performance with other students' performance in the class.
77. Identification type of test
- An objective type of test, in the form of a completion test, in which a term is defined, described, explained, or indicated by a picture, diagram, or concrete object, and the term referred to is supplied by the pupil or student.
78. How to construct the identification type test
1. The definition, description, or explanation of the term may be given by means of a phrase or an incomplete statement if it is not indicated by a picture, diagram, or concrete object.
2. The statement should be phrased so that there is only one possible response.
79. How to construct the identification type test
Ex. Direction: Identify the following.
1. The best and most accurate measure of variability.
Answer: Standard Deviation
2. The process of evaluating a test item to determine its difficulty value, discriminating power, and effectiveness.
Answer: Item Analysis
81. Multiple-choice Test
• A test used to measure knowledge outcomes and other types of learning outcomes such as comprehension and application.
• The most commonly used format for measuring student achievement at different levels of learning.
82. Characteristics:
It consists of a problem and a list of suggested solutions. The problem, which may be stated as a direct question or an incomplete statement, is called the 'stem' of the item. The suggested solutions, which may be words, numbers, symbols, or phrases, are called alternatives. The correct alternative is called the answer or key, and the remaining alternatives are called distracters.
83. Multiple-choice item
• Consists of three parts:
1. Stem - presents the problem or question.
2. Keyed option - the correct answer.
3. Distracters - incorrect options or alternatives.
84. GENERAL GUIDELINES in Constructing the Multiple-choice Test
1. Make a practical test.
2. Use diagrams or drawings when asking questions about application, analysis, or evaluation.
3. Use tables, figures, or charts when asking students to interpret.
85. GENERAL GUIDELINES in Constructing the Multiple-choice Test
4. Use pictures when students are required to apply concepts and principles.
5. When asking students to interpret or evaluate quotations, present actual quotations.
86. GENERAL GUIDELINES in Constructing the Multiple-choice Test
6. List choices vertically, not horizontally.
7. Avoid trivial questions.
8. Use one correct answer only.
9. Use three to five options.
10. Be sure to use effective distracters.
11. Increase the similarity of the options.
12. Do not use 'none of the above' when asking for a best answer.
13. Avoid using 'all of the above' options.
87. Guidelines in Constructing the Stem
1. The stem should be written in question form or completion form.
2. Do not leave a blank at the beginning or in the middle of the stem in completion form.
3. Keep it clear and concise.
4. Write it in positive form. Avoid using negative words; if you must use them, underline or capitalize them. Ex: Which of the following does not belong to the group? Or: Which of the following does NOT belong to the group?
5. The stem must be grammatically correct.
88. The stem may be a direct question or an incomplete statement. The direct question is easier to write; the incomplete statement is more concise.
Example:
Direct-question form:
Which one of the following glands produces growth hormone?
a. Thyroid  b. Anterior pituitary  c. Adrenal cortex  d. Pancreas
Incomplete-statement form:
Growth hormone is produced by
a. Thyroid  b. Anterior pituitary  c. Adrenal cortex  d. Pancreas
92. Which of the following philosophical schools was most identified with the Greek philosopher Aristotle?
A. Stoicism
B. Agnosticism
C. Platonism
D. Empiricism
• Guidelines in Constructing Options
1. One correct or best answer only.
2. The list of options is vertical.
3. Avoid creating a pattern.
4. Options must be homogeneous in content.
5. As much as possible, options must be of the same length.
6. Avoid the phrases 'all of the above', 'none of the above' or 'I don't know.'
93. Which of the following philosophical schools was most identified with the
Greek Philosopher Aristotle?
A. Stoicism
B. Agnosticism
C. Platonism
D. Empiricism
⢠Guidelines in Constructing Distracters
1. The distracters should be plausible.
2. Should be equally popular with the rest of the options.
99. Advantages:
• Measures learning outcomes.
• Scoring is highly objective, easy, and reliable.
• Scores are more reliable than those of subjective-type tests.
• Distracters can provide diagnostic information.
100. Disadvantages:
• Time-consuming to construct.
• Difficult to construct plausible distracters.
• In some cases, there is more than one possible answer.
• Ineffective in assessing the problem-solving skills of students.
• Not applicable in assessing students' ability to organize and express ideas.
101. GOAL
Our goal is to design questions that students who
understand will answer correctly and students who do
not understand will answer incorrectly.
102. 4. Multiple response items
Here more than one of the given alternatives is correct, but there is only one correct answer to the precise question stated in the first sentence of the item. This is a versatile type of objective test, lending itself to the testing of recall, reasoning, and the exercise of judgment.
103. Example:
Which of the following would be included among the group of potassium-sparing diuretics?
I. Diamox and bumet
II. Chlorthalidone and chlorothiazide
III. Spironolactone and triamterene
IV. Frusemide and ethacrynic acid
a) III only
b) II only
c) I and IV only
d) II and III
104. Simple Recall Test
• An objective type of test that sometimes requires the student to supply an answer to a direct question and sometimes requires him to complete a statement where a word or phrase has been omitted.
105. How to construct a Simple Recall Test
• Do not lift statements from the book verbatim.
• Frame the questions so that only one correct answer is possible.
• Avoid extraneous hints that give the students clues to the answer.
• Design the test items so that the blank comes at the end of the statement.
106. How to construct a Simple Recall Test
• If the item requires the pupils to compute figures in order to arrive at an answer, always indicate the units in which to express the answer, whether in feet or inches, or in seconds, minutes, or hours and minutes.
107. Example
Direction: Follow the directions in each of the problems below. Write the answer to each problem in the blank provided at the left. You may use the other side of this sheet for computation.
15.1  1. The following are the scores of ten Mathematics students: 18, 12, 16, 10, 10, 22, 15, 13, 17, and 18. What is the mean?
A. Binet  2. Who is the father of mental/modern testing?
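Item 1 can be checked directly: the ten listed scores sum to 151, so the mean is 15.1.

```python
scores = [18, 12, 16, 10, 10, 22, 15, 13, 17, 18]
mean_score = sum(scores) / len(scores)
print(mean_score)  # 151 / 10 = 15.1
```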
108. Labeling Type Test
• An objective test in which the names of parts of a diagram, map, drawing, or picture are to be indicated.
109. How to construct the labeling type of test
• Make the diagram, map, drawing, or picture to be labeled very clear and recognizable, especially the parts to be labeled.
• The parts to be labeled should be indicated by arrows so that the labels can be written in a vertical column in a definite place and not on the face of the diagram, map, drawing, or picture.
• Labeling can become a matching type of test if the labels, with some extras, are given.
110. Enumeration Type Test
• An objective type of test in which there are two or more responses to an item.
111. Matching Items
⢠Matching items require students to match a
series of stems or premises to a response or
principle. They consist of a set of directions, a
column of statements and a column of
responses.
112. Advantages of matching questions
Matching questions are particularly good at assessing a
student's understanding of relationships.
113. Tips for writing good matching questions include:
• Provide clear directions
• Keep the information in each column as homogeneous as possible
• Allow the responses to be used more than once
• Arrange the list of responses systematically if possible (chronologically, alphabetically, numerically)
• Include more responses than stems to help prevent students from using a process of elimination to answer the question.
115. Extended matching questions
• Extended matching questions are a variation of MCQs in which the stem of the question usually contains a scenario followed by a long list of items, each of which must be matched with a list of options. Either list may be longer than the other, and the same answer may be required more than once. These question types are popular in medicine and, if well written, can test higher-level skills.
116. • Example: Theme - Fatigue
• Options:
A. Acute leukaemia
B. Anaemia of chronic disease
C. Congestive heart failure
D. Depression
E. Epstein-Barr virus infection
F. Folate deficiency
G. Glucose 6-phosphate dehydrogenase
H. Hereditary spherocytosis
I. Hypothyroidism
J. Iron deficiency
K. Lyme disease
L. Microangiopathic hemolytic anemia
M. Miliary tuberculosis
N. Vitamin B12 (cyanocobalamin)
117. Merits of objective tests
• They are more objective in scoring
• They may be very comprehensive and can be constructed in a way to cover more subject matter
• Time is devoted to the thinking process
• They are very easy to score
• As a key is prepared, anyone can evaluate
• The introduction of the OMR technique has increased the scope of objective tests
• There is no question regarding the accuracy of marks
118. • There is no space for bias
• Less tiring
• They are more educative
• They are more reliable
• Irrespective of the examiner, the score will be the same
• The mood of the examiner in no way affects scoring
• Requires less time
• It eliminates extraneous factors
119. Demerits of objective tests
• Students have no opportunity to show their ability of expression
• Students miss the valuable experience of making comparisons, giving explanations, or giving definitions
• They are not asked to summarize material or to make applications of principles
• Reasoning is discouraged and students start guessing
• They fail to test the character-building aspects
120. ABSTRACT
ASSESSMENT AND GRADING PRACTICES
IN SCHOOLS OF NURSING: NATIONAL SURVEY
FINDINGS
AUTHOR: Marilyn H. Oermann
YEAR: 2011
121. ABSTRACT:
⢠In 2011, the Evaluation of Learning Advisory Council
of the National League for Nursing conducted a survey on
the assessment and evaluation strategies and grading
practices used by nurse faculty in prelicensure RN
programs. This article describes how faculty evaluate
student learning in the cognitive and affective domains
and factors that influence their decisions about
assessment and grading. A 29-item web-based survey
was completed by 1,573 nurse faculty from all types of
prelicensure programs.