4. Activity: Arrange the following steps in preparing
the table of specification used by the test
constructor.
• Make a two-way chart of a table of specification
• Make an outline of the subject matter to be covered in the test
• Construct the test items
• Select the learning outcomes to be measured
• Decide on the number of items per subtopic
5. Philippine Professional Standards for Teachers (PPST)
Domain 5: Assessment and Reporting
Domain 5 relates to processes associated with a variety of
assessment tools and strategies used by teachers in
monitoring, evaluating, documenting and reporting learners’
needs, progress and achievement. This Domain concerns the
use of assessment data in a variety of ways to inform and
enhance the teaching and learning process and programs. It
concerns teachers providing learners with the necessary
feedback about learning outcomes. This feedback informs the
reporting cycle and enables teachers to select, organize and
use sound assessment processes.
6. Domain 5, Assessment and Reporting, is composed of five
strands:
1. Design, selection, organization and utilization of assessment
strategies
2. Monitoring and evaluation of learner progress and
achievement
3. Feedback to improve learning
4. Communication of learner needs, progress and achievement
to key stakeholders
5. Use of assessment data to enhance teaching and learning
practices and programs
8. Table of Specification
• A chart or table that details the content and cognitive level
assessed on a test, as well as the types and emphases of the test items
• Very important in addressing the validity and reliability of the test
items
• Provides the test constructor a way to ensure that the assessment is
based on the intended learning outcomes
• A way of ensuring that the number of questions on the test is
adequate to ensure dependable results that are not likely caused by
chance
• A useful guide in constructing a test and in determining the type of
test items that you need to construct
9. DO 79, S. 2003 – ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF
STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS
AMENDED BY DO 82, S. 2003 – AMENDMENT OF DEPED ORDER NO. 79 S. 2003 ON
ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’
PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS
1. This Department, responding to the need for an assessment and evaluation system that truly
reflects student performance, issues the following guidelines in the assessment and reporting of
students’ progress:
1.1 Grades shall not be computed on the basis of any transmutation table that equates zero to a
pre-selected base (such as 50 or 70) and adjusts other scores accordingly.
1.2 Grades shall be based on assessment that covers the range of learning competencies
specified in the Philippine Elementary Learning Competencies (PELC) and Philippine Secondary
Schools Learning Competencies (PSSLC). The test shall be designed as follows:
- 60% easy items focused on basic content and skills expected of a student in
each grade or year level;
- 30% medium-level items focused on higher-level skills; and
- 10% difficult items focused on desirable content or skills that aim to distinguish the fast learners.
10. DO 33, s. 2004 - Implementing Guidelines on the
Performance-Based Grading System for SY 2004-2005
3. In assessing learning outcomes, the construction of the test
design should consist of 60% basic items, 30% more advanced
items and 10% items for distinguishing honor students.
Questions in each category should have different weights. Test
and non-test items should cover only materials actually taken up
in class.
Factual information (easy) – 60%
Moderately difficult (average) – 30%
Higher order thinking skills (difficult) – 10%
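The 60-30-10 distribution above is a simple proportional split. A minimal sketch in Python, assuming a hypothetical 50-item test (the percentages come from the order; the item total here is only an example):

```python
# Split a test into easy/average/difficult counts per the 60%-30%-10% rule.
def difficulty_split(total_items):
    easy = round(total_items * 0.60)          # factual information (easy)
    average = round(total_items * 0.30)       # moderately difficult (average)
    difficult = total_items - easy - average  # remainder keeps the sum exact
    return easy, average, difficult

# Example: a hypothetical 50-item periodic test
print(difficulty_split(50))  # (30, 15, 5)
```

Assigning the difficult band the remainder, rather than rounding it too, guarantees the three counts always add up to the total.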
13. Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
Bloom (1956) – 1. Knowledge: Remembering or retrieving previously learned
material.
Examples of verbs: identify, relate, list, define, recall, memorize, repeat,
record, name, recognize, acquire
Anderson/Krathwohl (2001) – 1. Remembering: Objectives written on the
remembering level involve retrieving, recalling, or recognizing knowledge
from memory. Remembering is when memory is used to produce definitions,
facts, or lists; to recite or retrieve material.
Sample verbs: state, tell, underline, locate, match, spell, fill in the blank,
identify, relate, list, define, recall, memorize, repeat, record, name,
recognize, acquire
14. Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
Bloom (1956) – 2. Comprehension: The ability to grasp or construct meaning
from material.
Examples of verbs: restate, locate, report, recognize, explain, express,
identify, discuss, describe, review, infer, conclude, illustrate, interpret,
draw, represent, differentiate
Anderson/Krathwohl (2001) – 2. Understanding: Constructing meaning from
different types of functions, be they written or graphic messages, through
activities like interpreting, exemplifying, classifying, summarizing,
inferring, comparing, and explaining.
Sample verbs: restate, locate, report, recognize, explain, express, identify,
discuss, describe, review, infer, conclude, illustrate, interpret, draw,
represent, differentiate
15. Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
Bloom (1956) – 3. Application: The ability to use learned material, or to
implement material in new and concrete situations.
Examples of verbs: apply, relate, develop, translate, use, operate, organize,
employ, restructure, interpret, demonstrate, illustrate, practice, calculate,
show, exhibit, dramatize
Anderson/Krathwohl (2001) – 3. Applying: Carrying out or using a procedure
through executing or implementing. Applying refers to situations where
learned material is used through products like models, presentations,
interviews, or simulations.
Sample verbs: apply, relate, develop, translate, use, operate, organize,
employ, restructure, interpret, demonstrate, illustrate, practice, calculate,
show, exhibit, dramatize
16. Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
Bloom (1956) – 4. Analysis: The ability to break down or distinguish the
parts of the material into their components so that their organizational
structure may be better understood.
Examples of verbs: analyze, compare, probe, inquire, examine, contrast,
categorize, differentiate, investigate, detect, survey, classify, deduce,
experiment, scrutinize, discover, inspect, dissect, discriminate, separate
Anderson/Krathwohl (2001) – 4. Analyzing: Breaking material or concepts
into parts and determining how the parts relate or interrelate to one another
or to an overall structure or purpose. Mental actions included in this
function are differentiating, organizing, and attributing, as well as being
able to distinguish between the components or parts. When analyzing, one can
illustrate this mental function by creating spreadsheets, surveys, charts,
diagrams, or graphic representations.
17. Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
Bloom (1956) – 5. Synthesis: The ability to put parts together to form a
coherent or unique new whole.
Examples of verbs: produce, design, assemble, create, prepare, predict,
modify, plan, invent, formulate, collect, set up, generalize, document,
combine, propose, develop, arrange, construct, organize, originate, derive,
write
Anderson/Krathwohl (2001) – 5. Evaluating: Making judgments based on
criteria and standards through checking and critiquing. Critiques,
recommendations, and reports are some of the products that can be created to
demonstrate the processes of evaluation.
Sample verbs: appraise, choose, compare, conclude, decide, defend, evaluate,
give your opinion, judge, justify, prioritize, rank, rate, select, support,
value
18. Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
Bloom (1956) – 6. Evaluation: The ability to judge, check, and even critique
the value of material for a given purpose.
Examples of verbs: judge, assess, compare, evaluate, conclude, measure,
deduce, argue, decide, choose, rate, select, estimate, validate, consider,
appraise, value, criticize, infer
Anderson/Krathwohl (2001) – 6. Creating: Putting elements together to form a
coherent or functional whole; reorganizing elements into a new pattern or
structure through generating, planning, or producing. Creating requires
learners to put parts together in a new way, or to synthesize parts into
something new and different as a form or product. This process is the most
difficult mental function in the new taxonomy.
Sample verbs: change, combine, compose, construct, create, invent, design,
formulate, generate, produce, revise, reconstruct, rearrange, visualize,
write, plan
20. Table of Specification
Cognitive levels for item placement: Remembering/Understanding (60%, Easy);
Applying (30%, Average); Analyzing/Evaluating/Creating (10%, Difficult)

Learning Competency                      | No. of Days | No. of Items | Easy  | Average | Difficult
Basic Concepts of Fractions              | 1           | 5            | 1-5   |         |
Addition of Fractions                    | 1           | 5            | 6-10  |         |
Subtraction of Fractions                 | 1           | 5            | 11-15 |         |
Multiplication and Division of Fractions | 3           | 15           | 16-30 | 31-40   |
Application/Problem Solving              | 4           | 20           |       | 41-45   | 46-50
Total                                    | 10          | 50           | 30    | 15      | 5
21. How to Determine the No. of Items?
Formula:
No. of items = (number of days × desired total number of items) / total number of days
Example:
Learning Competency: Multiplication and Division of Fractions
Number of days: 3
Desired no. of items: 50
Total no. of class sessions: 10
No. of items = (3 × 50) / 10 = 15
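The formula above can be applied per competency in a few lines. A minimal sketch in Python, using the numbers from the example:

```python
# No. of items = (days for the competency * desired total items) / total days
def items_for_competency(days, desired_total_items, total_days):
    return round(days * desired_total_items / total_days)

# Example from the slide: Multiplication and Division of Fractions
print(items_for_competency(days=3, desired_total_items=50, total_days=10))  # 15
```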
22. Check your understanding:
Directions: Complete the table by supplying the no. of items for each
learning competency.

Learning Competency     | No. of Class Sessions | No. of Items
Musculo-Skeletal System | 2                     |
Integumentary System    | 2                     |
Digestive System        | 3                     |
Respiratory System      | 3                     |
Circulatory System      | 4                     |
Total                   | 14                    |
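One way to check your answers is to apply the same proportional formula to every row. The sketch below assumes a hypothetical desired total of 50 items, which the exercise does not specify, and uses largest-remainder rounding so the counts sum exactly to that total:

```python
import math

# Allocate test items in proportion to class sessions, then round with the
# largest-remainder method so the counts sum exactly to the desired total.
# NOTE: the 50-item total is a hypothetical example; the exercise leaves it open.
def allocate_items(sessions, desired_total):
    total_days = sum(sessions.values())
    exact = {k: d * desired_total / total_days for k, d in sessions.items()}
    items = {k: math.floor(v) for k, v in exact.items()}
    leftover = desired_total - sum(items.values())
    # Hand the remaining items to the largest fractional remainders.
    for k in sorted(exact, key=lambda k: exact[k] - items[k], reverse=True)[:leftover]:
        items[k] += 1
    return items

sessions = {
    "Musculo-Skeletal System": 2,
    "Integumentary System": 2,
    "Digestive System": 3,
    "Respiratory System": 3,
    "Circulatory System": 4,
}
print(allocate_items(sessions, desired_total=50))
```

Plain rounding of 2/14 × 50 = 7.14 per row would leave the column two items short of 50; largest-remainder rounding avoids that.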
23. Workshop – Making Table of Specification
Using your Curriculum Guide, make a
table of specification for Periodic Test per
subject area in each quarter.
26. Objectives:
1. Identify the different rules in constructing a multiple choice test
2. Construct a multiple choice test
28. Research indicates . . .
• Teachers tend to use tests that they have prepared
themselves much more often than any other type of
test. (How Teaching Matters, National Council for
Accreditation of Teacher Education, Oct. 2000)
• While assessment options are diverse, most classroom
educators rely on text and curriculum-embedded
questions, and tests that are overwhelmingly classified
as paper-and-pencil (National Commission on Teaching
and America’s Future, 1996)
29. Research indicates . . .
• Formal training in paper-and-pencil test construction may occur
at the preservice level (52% of the time) or as in-service
preparation (21%). A significant number of professional
educators (48%) report no formal training in developing,
administering, scoring, and interpreting tests (Education Week,
“National Survey of Public School Teachers,” 2000).
• Students report a higher level of test anxiety over teacher-made
tests (64%) than over standardized tests (30%). The top three
reasons why: poor test construction, irrelevant or
obscure material coverage, and unclear directions.
(NCATE, “Summary Data on Teacher Effectiveness, Teacher
Quality, and Teacher Qualifications”, 2001)
30. Two General Categories of Test Items
1. Objective items which require students to select the
correct response from several alternatives or to supply
a word or short phrase to answer a question or
complete a statement. Objective items include:
multiple choice, true-false, matching, completion
2. Subjective or essay items which permit the student to
organize and present an original answer. Subjective
items include: short-answer essay, extended-response
essay, problem solving, performance test items
31. Creating a test is one of the most
challenging tasks confronting a
teacher.
Unfortunately, many of
us have had little, if any,
preparation in writing
tests.
32. What makes a test good or bad?
The most basic and
obvious answer to that
question is that good tests
measure what you want to
measure, and bad tests do
not.
33. When to use objective tests?
Objective tests are appropriate when:
The group to be tested is large and the test may
be reused.
Highly reliable scores must be obtained as
efficiently as possible.
Impartiality of evaluation, fairness, and
freedom from possible test scoring influences
are essential.
34. When to use objective tests?
Objective tests can be used to:
Measure almost any important educational
achievement a written test can measure.
Test understanding and ability to apply
principles.
Test ability to think critically.
Test ability to solve problems
35. The matching of
learning objective
expectations with
certain item types
provides a high
degree of test
validity: testing what
is supposed to be
tested.
36. Matching Learning Objectives with Test Items
Directions: Below are four test items categories labeled A, B, C, and D. Match the
learning objectives with the most appropriate test item category.
A – Objective test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D - Essay Test Item (short answer)
1. Name the parts of the human skeleton.
Answer: A
2. Appraise a composition on the basis of its organization.
Answer: C
3. Demonstrate safe laboratory skills.
Answer: B
37. Matching Learning Objectives with Test Items
Directions: Below are four test items categories labeled A, B, C, and D. Match the
learning objectives with the most appropriate test item category.
A – Objective test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D - Essay Test Item (short answer)
4. Cite four examples of satire that Twain uses in Huckleberry Finn.
Answer: D
5. Design a logo for a web page.
Answer: B
6. Describe the impact of a bull market.
Answer: C
7. Diagnose a physical ailment.
Answer: B
38. Matching Learning Objectives with Test Items
Directions: Below are four test items categories labeled A, B, C, and D. Match the
learning objectives with the most appropriate test item category.
A – Objective test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D - Essay Test Item (short answer)
8. List important mental attributes necessary for an athlete.
Answer: D
9. Categorize great American fiction writers.
Answer: A
10. Analyze the major causes of learning disabilities.
Answer: C
39. In general, test items should . . .
•Assess achievement of instructional objectives
•Measure important aspects of the subject
(concepts and conceptual relations)
•Accurately reflect the emphasis placed on
important aspects of instruction
•Measure an appropriate level of student
knowledge
•Vary in levels of difficulty
40. Technical Quality of a Test
1. Cognitive Complexity
The test questions will focus on appropriate intellectual
activity ranging from simple recall of facts to problem
solving, critical thinking, and reasoning.
2. Content Quality
The test questions will permit students to demonstrate their
knowledge of challenging and important subject matter.
3. Meaningfulness
The test questions will be worth students’ time and students
will recognize and understand their value.
41. Technical Quality of a Test
4. Language Appropriateness
The language demands will be clear and
appropriate to the assessment tasks and to
students.
5. Transfer and Generalizability
Successful performance on the test will allow
valid generalizations about achievement to be
made.
42. Technical Quality of a Test
6. Fairness
Student performance will be measured in a way
that does not give advantage to factors
irrelevant to school learning; scoring schemes
will be similarly equitable.
7. Reliability
Answers to test questions will be consistently
trusted to represent what students know.
45. Question 1:
Multiple choice items provide highly reliable test scores because:
A. They do not place a high degree of dependence on the student’s
reading ability
B. They place a high degree of dependence on a teacher’s writing ability
C. They are a subjective measurement of student achievement
D. They allow a wide sampling of content and a reduced guessing factor
Answer: D
46. Question 2:
You should:
A. Always decide on an answer before reading the alternatives
B. Always review your marked exams
C. Never change an answer
D. Always do the multiple choice items on an exam first
Answer: B
47. Question 3:
The multiple choice item on the right is
structurally undesirable because:
A. A direct question is more desirable than
an incomplete statement
B. There is no explicit problem of
information in the stem
C. The alternatives are not all plausible
D. All of the above
E. A & B only
F. B & C only
G. A & C only
H. None of the above
Answer: D
You should:
A. Always decide on an answer
before reading the
alternatives
B. Always review your marked
exams
C. Never change an answer
D. Always do the multiple
choice items on an exam first
48. Question 4:
Question 3 multiple choice item on the
right is undesirable because:
A. It relies on an answer required in a
previous item
B. The stem does not supply enough
information
C. Eight alternatives are too many and
too confusing to the students
D. More alternatives just encourage
guessing
Answer: C
Question 3:
The multiple choice item on the
right is structurally undesirable
because:
A. A direct question is more
desirable than an incomplete
statement
B. There is no explicit problem of
information in the stem
C. The alternatives are not all
plausible
D. All of the above
E. A & B only
F. B & C only
G. A & C only
H. None of the above
49. Question 5
The right answers in multiple choice questions tend to be:
A. Longer and more descriptive
B. The same length as the wrong answer
C. At least a paragraph long
D. Short
Answer: A
50. Question 6
When guessing on a multiple choice question with numbers in the
answer
A. Always pick the most extreme
B. Pick the lowest range
C. Pick answers in the middle range
D. Always pick C
Answer: C
51. Question 7:
What is the process of elimination in a multiple choice question?
A. Skipping the entire question
B. Eliminating all answers with extreme modifiers
C. Just guessing
D. Eliminating the wrong answers
Answer: D
52. Question 8
It is unlikely that a student who is unskilled in untangling negative
statements will:
A. Quickly understand multiple choice items not written in this way
B. Not quickly understand multiple choice items not written in this
way
C. Quickly understand multiple choice items written in this way
D. Not quickly understand multiple choice items written in this way
Answer: C
53. Multiple Choice Test Items
An MC item consists of the stem, which identifies the question or problem,
and the response alternatives or choices. Usually, students are asked to
select the one alternative that best completes a statement or answers a
question.
Item Stem: Which of the following is a chemical change?
Response Alternatives: A. Evaporation of alcohol
B. Freezing of water
C. Burning of oil
D. Melting of wax
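The stem-plus-alternatives structure lends itself to a simple data representation. A minimal sketch in Python; the class and field names are illustrative, not from any standard library or DepEd format:

```python
from dataclasses import dataclass

# Illustrative structure for an MC item: one stem, lettered options, one key.
@dataclass
class MCItem:
    stem: str
    options: dict  # letter -> alternative text
    key: str       # letter of the single correct (best) answer

    def score(self, response):
        return 1 if response == self.key else 0

# The chemical-change example from the slide
item = MCItem(
    stem="Which of the following is a chemical change?",
    options={"A": "Evaporation of alcohol", "B": "Freezing of water",
             "C": "Burning of oil", "D": "Melting of wax"},
    key="C",
)
print(item.score("C"))  # 1
print(item.score("A"))  # 0
```

Keeping a single `key` field also enforces the "one correct or best answer" guideline from slide 57 by construction.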
54. General Guidelines in Constructing MC Test
1. Make the test item practical, with real-world applications for the
students.
2. Use diagrams or drawings when asking questions about application,
analysis, or evaluation.
3. When asking students to interpret or evaluate quotations, present
actual quotations from secondary sources such as published books or
newspapers.
4. Use tables, figures, or charts when asking questions that require
interpretation.
5. Use pictures if possible when students are required to apply concepts
and principles.
55. General Guidelines in Constructing MC Test
6. List the choices/options vertically not horizontally.
7. Avoid trivial questions.
8. Use only one correct answer or best answer format.
9. Use three to five options to discourage guessing.
10. Be sure that distracters are plausible and effective.
11. Increase the similarity of the options to increase the difficulty of
the item.
12. Do not use “none of the above” options when asking for a best
answer.
13. Avoid using “all of the above” options. It is usually the correct
answer and makes the item too easy for the examinee with partial
knowledge.
56. Guidelines in Constructing the Stem
1. The stem should be written in question form or completion form.
Research suggests that the question form is more advisable.
2. Do not leave the blank at the beginning or in the middle of the stem when
using the completion form of a multiple-choice test.
3. The stem should pose the problem completely.
4. The stem should be clear and concise.
5. Avoid excessive and meaningless words in the stem.
6. State the stem in positive form. Avoid negative words like “not”
or “except.” If a negative word cannot be avoided, underline or capitalize it.
Example: write “Which of the following does NOT belong to the group?” rather
than “Which of the following does not belong to the group?”
7. Avoid grammatical clues to the correct answer.
57. Guidelines in Constructing Options
1. There should be one correct or best answer in each item.
2. List options vertically, not horizontally, beneath the stem.
3. Arrange the options in logical order and use capital letters (A, B, C,
D, E) to indicate each option.
4. Options should not overlap; keep them independent.
5. All options must be homogeneous in content to increase the
difficulty of an item.
6. As much as possible, the options should be of equal length.
7. Avoid using the phrase “all of the above.”
8. Avoid using the phrase “none of the above” or “I don’t know.”
58. Guidelines in Constructing the Distracters
1. The distracters should be plausible.
2. The distracters should be equally popular with all examinees.
3. Avoid ineffective distracters. Replace distracters that do not
work on the examinees.
4. Each distracter should be chosen by at least 5% of examinees, but
by fewer than choose the key answer.
5. Revise distracters that are overly attractive; they might be
ambiguous to the examinees.
59. Advantages of MC Test
1. Measures learning outcomes from the knowledge level to the
evaluation level.
2. Scoring is highly objective, easy, and reliable.
3. Scores are more reliable than those of subjective types of test.
4. Measures broad samples of content within a short span of time.
5. Distracters can provide diagnostic information.
6. Item analysis can reveal the difficulty of an item and can
discriminate between high- and low-performing students.
60. Disadvantages of MC Test
1. Time consuming to construct a good item.
2. Difficult to find effective and plausible distracters.
3. Scores can be influenced by the reading ability of the examinees.
4. In some cases, there is more than one justifiable correct answer.
5. Ineffective in assessing the problem-solving skills of the
students.
6. Not applicable when assessing the students’ ability to organize
and express ideas.
61. Activity: Improve Mo Ako!
Directions: The following multiple choice
questions are poorly constructed. Write a
better version of the question.
72. “Understand that there is always one clearly best
answer. Your goal is not to trick students or require
them to make difficult judgments about two
options that are nearly equally correct. Your goal is
to design questions that students who understand
will answer correctly and students who do not
understand will answer incorrectly.”
John A. Johnson
Dept. of Psychology
Penn State University
73. POINTS TO PONDER. . .
A good lesson makes a good question
A good question makes a good content
A good content makes a good test
A good test makes a good grade
A good grade makes a good student
A good student makes a good COMMUNITY
Jesus Ochave Ph.D.
VP Research Planning and Development
Philippine Normal University