Test Construction
How to Write Effective Test
Questions
In-Service Training for Teachers
April 13-14, 2020
SDO Lucena
Objectives:
1. Discuss the steps in developing a table of specification
2. Construct a table of specification
TABLE OF SPECIFICATION
Activity: Arrange the following steps in preparing
the table of specification used by the test
constructor.
Make a two-way chart of a table of specification
Make an outline of the subject matter to be
covered in the test
Construct the test items
Select the learning outcomes to be
measured
Decide on the number of items per subtopic
Philippine Professional Standards for Teachers (PPST)
Domain 5: Assessment and Reporting
Domain 5 relates to processes associated with a variety of
assessment tools and strategies used by teachers in
monitoring, evaluating, documenting and reporting learners’
needs, progress and achievement. This Domain concerns the
use of assessment data in a variety of ways to inform and
enhance the teaching and learning process and programs. It
concerns teachers providing learners with the necessary
feedback about learning outcomes. This feedback informs the
reporting cycle and enables teachers to select, organize and
use sound assessment processes.
Domain 5, Assessment and Reporting, is composed of five
strands:
1. Design, selection, organization and utilization of assessment
strategies
2. Monitoring and evaluation of learner progress and
achievement
3. Feedback to improve learning
4. Communication of learner needs, progress and achievement
to key stakeholders
5. Use of assessment data to enhance teaching and learning
practices and programs
Strand 1: Design, selection, organization and
utilization of assessment strategies
Table of Specification
• A chart or table that details the content and cognitive level
assessed on a test, as well as the types and emphases of test items
• Very important in addressing the validity and reliability of the test
items
• Provides the test constructor a way to ensure that the assessment is
based on the intended learning outcomes
• A way of ensuring that the number of questions on the test is
adequate to ensure dependable results that are not likely caused by
chance
• A useful guide in constructing a test and in determining the type of
test items that you need to construct
DO 79, S. 2003 – ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF
STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS
AMENDED BY DO 82, S. 2003 – AMENDMENT OF DEPED ORDER NO. 79 S. 2003 ON
ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’
PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS
1. This Department, responding to the need for an assessment and evaluation system that truly
reflects student performance, issues the following guidelines in the assessment and reporting of
students’ progress:
1.1 Grades shall not be computed on the basis of any transmutation table that equates zero to a
pre-selected base (such as 50 or 70) and adjusts other scores accordingly.
1.2 Grades shall be based on assessment that covers the range of learning competencies
specified in the Philippine Elementary Learning Competencies (PELC) and Philippine Secondary
Schools Learning Competencies (PSSLC). The test shall be designed as follows:
- 60% easy items focused on basic content and skills expected of a student in
each grade or year level;
- 30% medium-level items focused on higher-level skills; and
- 10% difficult items focused on desirable content or skills that aim to distinguish the fast learners.
DO 33, s. 2004 - Implementing Guidelines on the
Performance-Based Grading System for SY 2004-2005
3. In assessing learning outcomes, the construction of the test
design should consist of 60% basic items, 30% more advanced
items and 10% items for distinguishing honor students.
Questions in each category should have different weights. Test
and non-test items should cover only materials actually taken up
in class.
Factual information (easy) – 60%
Moderately difficult (average) – 30%
Higher order thinking skills (difficult) – 10%
RPMS Manual, page 142: Assessment Objective and MOV
BLOOM’S REVISED TAXONOMY (Anderson and Krathwohl)
Bloom’s Taxonomy (1956) | Anderson/Krathwohl’s Revision (2001)
1. Knowledge: Remembering or
retrieving previously learned
material.
Examples of verbs that relate to this
function are: identify, relate, list,
define, recall, memorize, repeat,
record, name, recognize, acquire
1. Remembering: Objectives written
on the remembering level –
retrieving, recalling, or recognizing
knowledge from memory.
Remembering is when memory is
used to produce definitions, facts,
or lists; to recite or retrieve
material.
Sample verbs: state, tell, underline,
locate, match, spell, fill in the
blank, identify, relate, list, define,
recall, memorize, repeat, record,
name, recognize, acquire
Bloom’s Taxonomy (1956) | Anderson/Krathwohl’s Revision (2001)
2. Comprehension: The ability to
grasp or construct meaning from
material.
Examples of verbs: restate, locate,
report, recognize, explain, express,
identify, discuss, describe, review,
infer, conclude, illustrate, interpret,
draw, represent, differentiate
2. Understanding: Constructing
meaning from different types of
functions, be they written or graphic
messages, through activities like
interpreting, exemplifying, classifying,
summarizing, inferring, comparing,
and explaining.
Sample verbs: restate, locate, report,
recognize, explain, express, identify,
discuss, describe, review, infer,
conclude, illustrate, interpret, draw,
represent, differentiate
Bloom’s Taxonomy (1956) | Anderson/Krathwohl’s Revision (2001)
3. Application: The ability to use
learned material, or to implement
material in new and concrete
situations.
Examples of verbs: apply, relate,
develop, translate, use, operate,
organize, employ, restructure,
interpret, demonstrate, illustrate,
practice, calculate, show, exhibit,
dramatize
3. Applying: Carrying out or using a
procedure through executing or
implementing. Applying refers to
situations where learned material is
used through products like models,
presentations, interviews, or
simulations.
Sample verbs: apply, relate, develop,
translate, use, operate, organize,
employ, restructure, interpret,
demonstrate, illustrate, practice,
calculate, show, exhibit, dramatize
Bloom’s Taxonomy (1956) | Anderson/Krathwohl’s Revision (2001)
4. Analysis: The ability to break down
or distinguish the parts of the
material into their components so
that their organizational structure
may be better understood.
Examples of verbs: analyze, compare,
probe, inquire, examine, contrast,
categorize, differentiate, investigate,
detect, survey, classify, deduce,
experiment, scrutinize, discover,
inspect, dissect, discriminate,
separate
4. Analyzing: Breaking material or
concepts into parts, determining how
the parts relate or interrelate to one
another or to an overall structure or
purpose. Mental actions included in
this function are differentiating,
organizing, and attributing, as well as
being able to distinguish between
the components or parts. When one
is analyzing, he/she can illustrate this
mental function by creating
spreadsheets, surveys, charts, or
diagrams, or graphic representations.
Bloom’s Taxonomy (1956) | Anderson/Krathwohl’s Revision (2001)
5. Synthesis: The ability to put parts
together to form a coherent or
unique new whole.
Examples of verbs: produce, design,
assemble, create, prepare, predict,
modify, plan, invent, formulate,
collect, set up, generalize, document,
combine, propose, develop, arrange,
construct, organize, originate, derive,
write
5. Evaluating: Making judgments
based on criteria and standards
through checking and critiquing.
Critiques, recommendations, and
reports are some of the products
that can be created to demonstrate
the processes of evaluation.
Sample verbs: appraise, choose,
compare, conclude, decide, defend,
evaluate, give your opinion, judge,
justify, prioritize, rank, rate, select,
support, value
Bloom’s Taxonomy (1956) | Anderson/Krathwohl’s Revision (2001)
6. Evaluation: The ability to judge,
check, and even critique the value of
material for a given purpose.
Examples of verbs: judge, assess,
compare, evaluate, conclude,
measure, deduce, argue, decide,
choose, rate, select, estimate,
validate, consider, appraise, value,
criticize, infer
6. Creating: Putting elements together to
form a coherent or functional whole;
reorganizing elements into a new pattern
or structure through generating, planning,
or producing. Creating requires users to put
parts together in a new way, or synthesize
parts into a new and different form or
product. This process is the most
difficult mental function in the new
taxonomy. Sample verbs – change,
combine, compose, construct, create,
invent, design, formulate, generate,
produce, revise, reconstruct, rearrange,
visualize, write, plan
Table of Specification
Learning Competency | No. of Days | No. of Items | Remembering/Understanding (60%, Easy) | Applying (30%, Average) | Analyzing/Evaluating/Creating (10%, Difficult)
Basic Concepts of Fractions | 1 | 5 | 1-5 | |
Addition of Fractions | 1 | 5 | 6-10 | |
Subtraction of Fractions | 1 | 5 | 11-15 | |
Multiplication and Division of Fractions | 3 | 15 | 16-30 | 31-40 |
Application/Problem Solving | 4 | 20 | | 41-45 | 46-50
Total | 10 | 50 | 30 | 15 | 5
(The last three columns show the item placement for each cognitive level in the 50-item test.)
How to Determine the No. of Items?
Formula:
No. of items = (number of days × desired total number of items) ÷ (total number of days)
Example:
Learning Competency: Multiplication and Division of Fractions
Number of days: 3
Desired no. of items: 50
Total no. of class sessions: 10
No. of items = (3 × 50) ÷ 10 = 15
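The allocation formula above can be checked in a couple of lines of Python (the function name is illustrative, not part of any DepEd material):

```python
def items_for_competency(days, desired_total_items, total_days):
    """TOS allocation: the items allotted to a competency are
    proportional to the number of class days spent on it."""
    return days * desired_total_items / total_days

# Multiplication and Division of Fractions: 3 of 10 days, 50-item test
print(items_for_competency(3, 50, 10))  # → 15.0
```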
Check your understanding:
Directions: Complete the table by supplying the no. of items for each
learning competency.

Learning Competency | No. of Class Sessions | No. of Items
Musculo-Skeletal System | 2 |
Integumentary System | 2 |
Digestive System | 3 |
Respiratory System | 3 |
Circulatory System | 4 |
Total | 14 |
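One wrinkle this exercise surfaces: with 14 sessions the proportional shares are not whole numbers (e.g., 2 × 50 ÷ 14 ≈ 7.14). A common fix is largest-remainder rounding, so the per-competency counts still sum to the desired total. The sketch below assumes a 50-item test (the exercise does not state a total) and uses an illustrative function name:

```python
def allocate_items(sessions, desired_total):
    """Allocate test items in proportion to class sessions, rounding with
    the largest-remainder method so counts still sum to desired_total."""
    total_sessions = sum(sessions)
    raw = [s * desired_total / total_sessions for s in sessions]
    counts = [int(share) for share in raw]       # floor each share
    leftover = desired_total - sum(counts)       # items still unassigned
    # hand one extra item each to the largest fractional remainders
    by_remainder = sorted(range(len(raw)),
                          key=lambda i: raw[i] - counts[i], reverse=True)
    for i in by_remainder[:leftover]:
        counts[i] += 1
    return counts

# Class sessions from the table above, assuming a 50-item periodic test
print(allocate_items([2, 2, 3, 3, 4], 50))  # → [7, 7, 11, 11, 14]
```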
Workshop – Making Table of Specification
Using your Curriculum Guide, make a
table of specification for Periodic Test per
subject area in each quarter.
Test Construction
Objectives:
1. Identify the different rules in constructing multiple choice tests
2. Construct a multiple choice test
Research indicates . . .
• Teachers tend to use tests that they have prepared
themselves much more often than any other type of
test. (How Teaching Matters, National Council for
Accreditation of Teacher Education, Oct. 2000)
• While assessment options are diverse, most classroom
educators rely on text and curriculum-embedded
questions, and tests that are overwhelmingly classified
as paper-and-pencil (National Commission on Teaching
and America’s Future, 1996)
Research indicates . . .
• Formal training in paper-and-pencil test construction may occur
at the preservice level (52% of the time) or as in-service
preparation (21%). A significant number of professional
educators (48%) report no formal training in developing,
administering, scoring, and interpreting tests ("National Survey
of Public School Teachers," Education Week, 2000).
• Students report a higher level of test anxiety over teacher-made
tests (64%) than over standardized tests (30%). The top three
reasons why: poor test construction, irrelevant or
obscure material coverage, and unclear directions.
(NCATE, “Summary Data on Teacher Effectiveness, Teacher
Quality, and Teacher Qualifications”, 2001)
Two General Categories of Test Items
1. Objective items which require students to select the
correct response from several alternatives or to supply
a word or short phrase to answer a question or
complete a statement. Objective items include:
multiple choice, true-false, matching, completion
2. Subjective or essay items which permit the student to
organize and present an original answer. Subjective
items include: short-answer essay, extended-response
essay, problem solving, performance test items
Creating a test is one of the most
challenging tasks confronting a
teacher.
Unfortunately, many of
us have had little, if any,
preparation in writing
tests.
What makes a test good or bad?
The most basic and
obvious answer to that
question is that good tests
measure what you want to
measure, and bad tests do
not.
When to use objective tests?
Objective tests are appropriate when:
The group to be tested is large and the test may
be reused.
Highly reliable scores must be obtained as
efficiently as possible.
Impartiality of evaluation, fairness, and
freedom from possible test scoring influences
are essential.
When to use objective tests?
Objective tests can be used to:
Measure almost any important educational
achievement a written test can measure.
Test understanding and ability to apply
principles.
Test ability to think critically.
Test ability to solve problems
The matching of
learning objective
expectations with
certain item types
provides a high
degree of test
validity: testing what
is supposed to be
tested.
Matching Learning Objectives with Test Items
Directions: Below are four test item categories labeled A, B, C, and D. Match the
learning objectives with the most appropriate test item category.
A – Objective test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D - Essay Test Item (short answer)
1. Name the parts of the human skeleton.
Answer: A
2. Appraise a composition on the basis of its organization.
Answer: C
3. Demonstrate safe laboratory skills.
Answer: B
Matching Learning Objectives with Test Items
4. Cite four examples of satire that Twain uses in Huckleberry Finn.
Answer: D
5. Design a logo for a web page.
Answer: B
6. Describe the impact of a bull market.
Answer: C
7. Diagnose a physical ailment.
Answer: B
Matching Learning Objectives with Test Items
8. List important mental attributes necessary for an athlete.
Answer: D
9. Categorize great American fiction writers.
Answer: A
10. Analyze the major causes of learning disabilities.
Answer: C
In general, test items should . . .
•Assess achievement of instructional objectives
•Measure important aspects of the subject
(concepts and conceptual relations)
•Accurately reflect the emphasis placed on
important aspects of instruction
•Measure an appropriate level of student
knowledge
•Vary in levels of difficulty
Technical Quality of a Test
1. Cognitive Complexity
The test questions will focus on appropriate intellectual
activity ranging from simple recall of facts to problem
solving, critical thinking, and reasoning.
2. Content Quality
The test questions will permit students to demonstrate their
knowledge of challenging and important subject matter.
3. Meaningfulness
The test questions will be worth students’ time and students
will recognize and understand their value.
Technical Quality of a Test
4. Language Appropriateness
The language demands will be clear and
appropriate to the assessment tasks and to
students.
5. Transfer and Generalizability
Successful performance on the test will allow
valid generalizations about achievement to be
made.
Technical Quality of a Test
6. Fairness
Student performance will be measured in a way
that does not give advantage to factors
irrelevant to school learning; scoring schemes
will be similarly equitable.
7. Reliability
Answers to test questions can consistently be
trusted to represent what students know.
Activity: Piliin Mo Ako! ("Choose Me!")
Directions: Choose the letter of the best
answer.
Question 1:
Multiple choice items provide highly reliable test scores because:
A. They do not place a high degree of dependence on the student's
reading ability
B. They place a high degree of dependence on a teacher's writing ability
C. They are a subjective measurement of student achievement
D. They allow a wide sampling of content and a reduced guessing factor
Answer: D
Question 2:
You should:
A. Always decide on an answer before reading the alternatives
B. Always review your marked exams
C. Never change an answer
D. Always do the multiple choice items on an exam first
Answer: B
Question 3:
The multiple choice item on the right is
structurally undesirable because:
A. A direct question is more desirable than
an incomplete statement
B. There is no explicit problem of
information in the stem
C. The alternatives are not all plausible
D. All of the above
E. A & B only
F. B & C only
G. A & C only
H. None of the above
Answer: D
You should:
A. Always decide on an answer
before reading the
alternatives
B. Always review your marked
exams
C. Never change an answer
D. Always do the multiple
choice items on an exam first
Question 4:
The Question 3 multiple choice item on
the right is undesirable because:
A. It relies on an answer required in a
previous item
B. The stem does not supply enough
information
C. Eight alternatives are too many and
too confusing to the students
D. More alternatives just encourage
guessing
Answer: C
Question 3:
The multiple choice item on the
right is structurally undesirable
because:
A. A direct question is more
desirable than an incomplete
statement
B. There is no explicit problem of
information in the stem
C. The alternatives are not all
plausible
D. All of the above
E. A & B only
F. B & C only
G. A & C only
H. None of the above
Question 5
The right answers in multiple choice questions tend to be:
A. Longer and more descriptive
B. The same length as the wrong answer
C. At least a paragraph long
D. Short
Answer: A
Question 6
When guessing on a multiple choice question with numbers in the
answer
A. Always pick the most extreme
B. Pick the lowest range
C. Pick answers in the middle range
D. Always pick C
Answer: C
Question 7:
What is the process of elimination in a multiple choice question?
A. Skipping the entire question
B. Eliminating all answers with extreme modifiers
C. Just guessing
D. Eliminating the wrong answers
Answer: D
Question 8
It is unlikely that a student who is unskilled in untangling negative
statements will:
A. Quickly understand multiple choice items not written in this way
B. Not quickly understand multiple choice items not written in this
way
C. Quickly understand multiple choice items written in this way
D. Not quickly understand multiple choice items written in this way
Answer: C
Multiple Choice Test Items
An MC item consists of a stem, which identifies the question or problem,
and the response alternatives or choices. Usually, students are asked to
select the one alternative that best completes a statement or answers a
question.
Item Stem: Which of the following is a chemical change?
Response Alternatives: A. Evaporation of alcohol
B. Freezing of water
C. Burning of oil
D. Melting of wax
General Guidelines in Constructing MC Test
1. Make the test item practical, with real-world
applications for the students.
2. Use diagrams or drawings when asking questions about
application, analysis, or evaluation.
3. When asking students to interpret or evaluate quotations,
present actual quotations from secondary sources like
published books or newspapers.
4. Use tables, figures, or charts when asking questions that
require interpretation.
5. Use pictures, if possible, when students are required to
apply concepts and principles.
General Guidelines in Constructing MC Test
6. List the choices/options vertically not horizontally.
7. Avoid trivial questions.
8. Use only one correct answer or best answer format.
9. Use three to five options to discourage guessing.
10. Be sure that distracters are plausible and effective.
11. Increase the similarity of the options to increase the difficulty of
the item.
12. Do not use “none of the above” options when asking for a best
answer.
13. Avoid using “all of the above” options. It is usually the correct
answer and makes the item too easy for the examinee with partial
knowledge.
Guidelines in Constructing the Stem
1. The stem should be written in question form or completion form.
Research suggests that the question form is more advisable.
2. Do not leave the blank at the beginning or in the middle of the stem when
using the completion form of a multiple-choice test.
3. The stem should pose the problem completely.
4. The stem should be clear and concise.
5. Avoid excessive and meaningless words in the stem.
6. State the stem in positive form, avoiding negative words like "not"
or "except." If a negative cannot be avoided, underline or capitalize it.
Example: Which of the following does not belong to the group? vs. Which
of the following does NOT belong to the group?
7. Avoid grammatical clues in the correct answer.
Guidelines in Constructing Options
1. There should be one correct or best answer in each item.
2. List options in vertical order not horizontal order beneath the stem.
3. Arrange the options in logical order and use capital letters to
indicate each option such as A, B, C, D, E.
4. No overlapping options; keep them independent.
5. All options must be homogenous in content to increase the
difficulty of an item.
6. As much as possible the length of the options must be the same or
equal.
7. Avoid using the phrase “all of the above.”
8. Avoid using the phrase “none of the above” or “I don’t know.”
Guidelines in Constructing the Distracters
1. The distracters should be plausible.
2. The distracters should be equally popular to all examinees.
3. Avoid using ineffective distracters. Replace distracter(s)
that are not effective to the examinees.
4. Each distracter should be chosen by at least 5% of examinees,
but by fewer examinees than choose the key answer.
5. Revise distracters that are overly attractive to examinees;
they might be ambiguous.
Advantages of MC Test
1. Measures learning outcomes from the knowledge to
evaluation level.
2. Scoring is highly objective, easy and reliable.
3. Scores are more reliable than subjective type of test.
4. Measures broad samples of content within a short
span of time.
5. Distracters can provide diagnostic information.
6. Item analysis can reveal the difficulty of an item and
can discriminate between high- and low-performing students.
Disadvantages of MC Test
1. Time consuming to construct a good item.
2. Difficult to find effective and plausible distracters.
3. Scores can be influenced by the reading ability of the
examinees.
4. In some cases, there is more than one justifiable correct
answer.
5. Ineffective in assessing the problem-solving skills of
the students.
6. Not applicable when assessing students' ability to
organize and express ideas.
Activity: Improve Mo Ako! ("Improve Me!")
Directions: The following multiple choice
questions are poorly constructed. Write a
better version of each question.
Items 1-10: each slide presents a poor item alongside space for a better item.
“Understand that there is always one clearly best
answer. Your goal is not to trick students or require
them to make difficult judgments about two
options that are nearly equally correct. Your goal is
to design questions that students who understand
will answer correctly and students who do not
understand will answer incorrectly."
John A. Johnson
Dept. of Psychology
Penn State University
POINTS TO PONDER. . .
A good lesson makes a good question
A good question makes a good content
A good content makes a good test
A good test makes a good grade
A good grade makes a good student
A good student makes a good COMMUNITY
Jesus Ochave Ph.D.
VP Research Planning and Development
Philippine Normal University
Evaluation: Determining the Effect of the Intervention Evaluation: Determining the Effect of the Intervention
Evaluation: Determining the Effect of the Intervention
 
Developing assessment instruments
Developing assessment instrumentsDeveloping assessment instruments
Developing assessment instruments
 
Lesson3_Performance Assessment_March5 2024.pdf
Lesson3_Performance Assessment_March5 2024.pdfLesson3_Performance Assessment_March5 2024.pdf
Lesson3_Performance Assessment_March5 2024.pdf
 
Bloom’s taxonomy
Bloom’s taxonomyBloom’s taxonomy
Bloom’s taxonomy
 
Evaluating the curriculum
Evaluating the curriculumEvaluating the curriculum
Evaluating the curriculum
 
An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...
An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...
An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...
 
Outcomnes-based Education
Outcomnes-based EducationOutcomnes-based Education
Outcomnes-based Education
 
Assessment-of-learning
 Assessment-of-learning Assessment-of-learning
Assessment-of-learning
 

Recently uploaded

ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
PECB
 
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
Nguyen Thanh Tu Collection
 
Liberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdfLiberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdf
WaniBasim
 
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Dr. Vinod Kumar Kanvaria
 
A Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdfA Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdf
Jean Carlos Nunes Paixão
 
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdfবাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
eBook.com.bd (প্রয়োজনীয় বাংলা বই)
 
A Strategic Approach: GenAI in Education
A Strategic Approach: GenAI in EducationA Strategic Approach: GenAI in Education
A Strategic Approach: GenAI in Education
Peter Windle
 
Life upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for studentLife upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for student
NgcHiNguyn25
 
Chapter 4 - Islamic Financial Institutions in Malaysia.pptx
Chapter 4 - Islamic Financial Institutions in Malaysia.pptxChapter 4 - Islamic Financial Institutions in Malaysia.pptx
Chapter 4 - Islamic Financial Institutions in Malaysia.pptx
Mohd Adib Abd Muin, Senior Lecturer at Universiti Utara Malaysia
 
South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)
Academy of Science of South Africa
 
PIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf IslamabadPIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf Islamabad
AyyanKhan40
 
Digital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments UnitDigital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments Unit
chanes7
 
Assessment and Planning in Educational technology.pptx
Assessment and Planning in Educational technology.pptxAssessment and Planning in Educational technology.pptx
Assessment and Planning in Educational technology.pptx
Kavitha Krishnan
 
Types of Herbal Cosmetics its standardization.
Types of Herbal Cosmetics its standardization.Types of Herbal Cosmetics its standardization.
Types of Herbal Cosmetics its standardization.
Ashokrao Mane college of Pharmacy Peth-Vadgaon
 
Advanced Java[Extra Concepts, Not Difficult].docx
Advanced Java[Extra Concepts, Not Difficult].docxAdvanced Java[Extra Concepts, Not Difficult].docx
Advanced Java[Extra Concepts, Not Difficult].docx
adhitya5119
 
How to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRMHow to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRM
Celine George
 
DRUGS AND ITS classification slide share
DRUGS AND ITS classification slide shareDRUGS AND ITS classification slide share
DRUGS AND ITS classification slide share
taiba qazi
 
Hindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdfHindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdf
Dr. Mulla Adam Ali
 
How to Add Chatter in the odoo 17 ERP Module
How to Add Chatter in the odoo 17 ERP ModuleHow to Add Chatter in the odoo 17 ERP Module
How to Add Chatter in the odoo 17 ERP Module
Celine George
 
The simplified electron and muon model, Oscillating Spacetime: The Foundation...
The simplified electron and muon model, Oscillating Spacetime: The Foundation...The simplified electron and muon model, Oscillating Spacetime: The Foundation...
The simplified electron and muon model, Oscillating Spacetime: The Foundation...
RitikBhardwaj56
 

Recently uploaded (20)

ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...
 
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
BÀI TẬP BỔ TRỢ TIẾNG ANH 8 CẢ NĂM - GLOBAL SUCCESS - NĂM HỌC 2023-2024 (CÓ FI...
 
Liberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdfLiberal Approach to the Study of Indian Politics.pdf
Liberal Approach to the Study of Indian Politics.pdf
 
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
Exploiting Artificial Intelligence for Empowering Researchers and Faculty, In...
 
A Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdfA Independência da América Espanhola LAPBOOK.pdf
A Independência da América Espanhola LAPBOOK.pdf
 
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdfবাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
বাংলাদেশ অর্থনৈতিক সমীক্ষা (Economic Review) ২০২৪ UJS App.pdf
 
A Strategic Approach: GenAI in Education
A Strategic Approach: GenAI in EducationA Strategic Approach: GenAI in Education
A Strategic Approach: GenAI in Education
 
Life upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for studentLife upper-Intermediate B2 Workbook for student
Life upper-Intermediate B2 Workbook for student
 
Chapter 4 - Islamic Financial Institutions in Malaysia.pptx
Chapter 4 - Islamic Financial Institutions in Malaysia.pptxChapter 4 - Islamic Financial Institutions in Malaysia.pptx
Chapter 4 - Islamic Financial Institutions in Malaysia.pptx
 
South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)South African Journal of Science: Writing with integrity workshop (2024)
South African Journal of Science: Writing with integrity workshop (2024)
 
PIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf IslamabadPIMS Job Advertisement 2024.pdf Islamabad
PIMS Job Advertisement 2024.pdf Islamabad
 
Digital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments UnitDigital Artifact 1 - 10VCD Environments Unit
Digital Artifact 1 - 10VCD Environments Unit
 
Assessment and Planning in Educational technology.pptx
Assessment and Planning in Educational technology.pptxAssessment and Planning in Educational technology.pptx
Assessment and Planning in Educational technology.pptx
 
Types of Herbal Cosmetics its standardization.
Types of Herbal Cosmetics its standardization.Types of Herbal Cosmetics its standardization.
Types of Herbal Cosmetics its standardization.
 
Advanced Java[Extra Concepts, Not Difficult].docx
Advanced Java[Extra Concepts, Not Difficult].docxAdvanced Java[Extra Concepts, Not Difficult].docx
Advanced Java[Extra Concepts, Not Difficult].docx
 
How to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRMHow to Manage Your Lost Opportunities in Odoo 17 CRM
How to Manage Your Lost Opportunities in Odoo 17 CRM
 
DRUGS AND ITS classification slide share
DRUGS AND ITS classification slide shareDRUGS AND ITS classification slide share
DRUGS AND ITS classification slide share
 
Hindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdfHindi varnamala | hindi alphabet PPT.pdf
Hindi varnamala | hindi alphabet PPT.pdf
 
How to Add Chatter in the odoo 17 ERP Module
How to Add Chatter in the odoo 17 ERP ModuleHow to Add Chatter in the odoo 17 ERP Module
How to Add Chatter in the odoo 17 ERP Module
 
The simplified electron and muon model, Oscillating Spacetime: The Foundation...
The simplified electron and muon model, Oscillating Spacetime: The Foundation...The simplified electron and muon model, Oscillating Spacetime: The Foundation...
The simplified electron and muon model, Oscillating Spacetime: The Foundation...
 

Test-Construction-B (1).pptx

  • 1. Test Construction How to Write Effective Test Questions In-Service Training for Teachers April 13-14, 2020 SDO Lucena
  • 2. Objectives: 1. Discuss the steps in developing a table of specification 2. Construct a table of specification
  • 4. Activity: Arrange the following steps in preparing the table of specification used by the test constructor. Make a two-way chart of a table of specification Make an outline of the subject matter to be covered in the test Construct the test items Select the learning outcomes to be measured Decide on the number of items per subtopic
  • 5. Philippine Professional Standards for Teachers (PPST) Domain 5: Assessment and Reporting Domain 5 relates to processes associated with a variety of assessment tools and strategies used by teachers in monitoring, evaluating, documenting and reporting learners’ needs, progress and achievement. This Domain concerns the use of assessment data in a variety of ways to inform and enhance the teaching and learning process and programs. It concerns teachers providing learners with the necessary feedback about learning outcomes. This feedback informs the reporting cycle and enables teachers to select, organize and use sound assessment processes.
  • 6. Domain 5, Assessment and Reporting, is composed of five strands: 1. Design, selection, organization and utilization of assessment strategies 2. Monitoring and evaluation of learner progress and achievement 3. Feedback to improve learning 4. Communication of learner needs, progress and achievement to key stakeholders 5. Use of assessment data to enhance teaching and learning practices and programs
  • 7. Domain 5, Assessment and Reporting, is composed of five strands: 1.Design, selection, organization and utilization of assessment strategies
  • 8. Table of Specification • A chart or table that details the content and cognitive levels assessed on a test, as well as the types and emphases of test items • Very important in addressing the validity and reliability of the test items • Provides the test constructor a way to ensure that the assessment is based on the intended learning outcomes • A way of ensuring that the number of questions on the test is adequate to yield dependable results that are not likely caused by chance • A useful guide in constructing a test and in determining the type of test items that you need to construct
  • 9. DO 79, S. 2003 – ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS AMENDED BY DO 82, S. 2003 – AMENDMENT OF DEPED ORDER NO. 79 S. 2003 ON ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS 1. This Department, responding to the need for an assessment and evaluation system that truly reflects student performance, issues the following guidelines in the assessment and reporting of students’ progress: 1.1 Grades shall not be computed on the basis of any transmutation table that equates zero to a pre-selected base (such as 50 or 70) and adjusts other scores accordingly. 1.2 Grades shall be based on assessment that covers the range of learning competencies specified in the Philippine Elementary Learning Competencies (PELC) and Philippine Secondary Schools Learning Competencies (PSSLC). The test shall be designed as follows: - 60% easy items focused on basic content and skills expected of a student in each grade or year level; - 30% medium-level items focused on higher level skills; and - 10% difficult items focused on desirable content or skills that aim to distinguish the fast learners.
  • 10. DO 33, s 2004 - Implementing Guidelines on the Performance-Based Grading System for SY 2004-2005 3. In assessing learning outcomes, the construction of the test design should consist of 60% basic items, 30% more advanced items and 10% items for distinguishing honor students. Questions in each category should have different weights. Test and non-test items should cover only materials actually taken up in class. Factual information (easy) – 60% Moderately difficult (average) – 30% Higher order thinking skills (difficult) – 10%
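The 60-30-10 difficulty split described in the two orders above can be sketched in a few lines of Python. The 50-item total is an illustrative assumption (it matches the sample table of specification later in the deck), not something the orders prescribe:

```python
# Sketch of the 60-30-10 difficulty split (easy / average / difficult).
# The weights come from DO 79/DO 33; the 50-item total is an example.
def difficulty_split(total_items, weights=(0.60, 0.30, 0.10)):
    """Return (easy, average, difficult) item counts for a test."""
    counts = [round(total_items * w) for w in weights]
    # Absorb any rounding drift into the largest bucket so the
    # counts always sum to the requested total.
    counts[0] += total_items - sum(counts)
    return tuple(counts)

print(difficulty_split(50))  # (30, 15, 5)
```

For a 50-item periodic test this yields 30 easy, 15 average, and 5 difficult items, which is exactly the breakdown shown in the sample table of specification that follows.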
  • 12. BLOOM’S REVISED TAXONOMY (Anderson and Krathwohl) 1956 2001
  • 13. Bloom’s Taxonomy in 1956 Anderson/Krathwohl’s Revision in 2001 1. Knowledge: Remembering or retrieving previously learned material. Examples of verbs that relate to this function are: identify, relate, list, define, recall, memorize, repeat, record, name, recognize, acquire 1. Remembering: Objectives written on the remembering level – retrieving, recalling, or recognizing knowledge from memory. Remembering is when memory is used to produce definitions, facts, or lists; to recite or retrieve material. Sample verbs: state, tell, underline, locate, match, spell, fill in the blank, identify, relate, list, define, recall, memorize, repeat, record, name, recognize, acquire
  • 14. Bloom’s Taxonomy in 1956 Anderson/Krathwohl’s Revision in 2001 2. Comprehension: The ability to grasp or construct meaning from material. Examples of verbs: restate, locate, report, recognize, explain, express, identify, discuss, describe, review, infer, conclude, illustrate, interpret, draw, represent, differentiate 2. Understanding: Constructing meaning from different types of messages, whether written or graphic, through activities like interpreting, exemplifying, classifying, summarizing, inferring, comparing and explaining. Sample verbs: restate, locate, report, recognize, explain, express, identify, discuss, describe, review, infer, conclude, illustrate, interpret, draw, represent, differentiate
  • 15. Bloom’s Taxonomy in 1956 Anderson/Krathwohl’s Revision in 2001 3. Application: The ability to use learned material, or to implement material in new and concrete situations. Examples of verbs: apply, relate, develop, translate, use, operate, organize, employ, restructure, interpret, demonstrate, illustrate, practice, calculate, show, exhibit, dramatize 3. Applying: Carrying out or using a procedure through executing, or implementing. Applying relates and refers to situations where learned material is used through products like models, presentations, interviews or simulations. Sample verbs: apply, relate, develop, translate, use, operate, organize, employ, restructure, interpret, demonstrate, illustrate, practice, calculate, show, exhibit, dramatize
  • 16. Bloom’s Taxonomy in 1956 Anderson/Krathwohl’s Revision in 2001 4. Analysis: The ability to break down or distinguish the parts of the material into their components so that their organizational structure may be better understood. Examples of verbs: analyze, compare, probe, inquire, examine, contrast, categorize, differentiate, investigate, detect, survey, classify, deduce, experiment, scrutinize, discover, inspect, dissect, discriminate, separate 4. Analyzing: Breaking material or concepts into parts, determining how the parts relate or interrelate to one another or to an overall structure or purpose. Mental actions included in this function are differentiating, organizing and attributing, as well as being able to distinguish between the components or parts. When one is analyzing, he/she can illustrate this mental function by creating spreadsheets, surveys, charts, diagrams, or graphic representations.
  • 17. Bloom’s Taxonomy in 1956 Anderson/Krathwohl’s Revision in 2001 5. Synthesis: The ability to put parts together to form a coherent or unique new whole. Examples of verbs: produce, design, assemble, create, prepare, predict, modify, plan, invent, formulate, collect, set up, generalize, document, combine, propose, develop, arrange, construct, organize, originate, derive, write 5. Evaluating: Making judgments based on criteria and standards through checking and critiquing. Critiques, recommendations, and reports are some of the products that can be created to demonstrate the processes of evaluation. Sample verbs: appraise, choose, compare, conclude, decide, defend, evaluate, give your opinion, judge, justify, prioritize, rank, rate, select, support, value
  • 18. Bloom’s Taxonomy in 1956 Anderson/Krathwohl’s Revision in 2001 6. Evaluation: The ability to judge, check, and even critique the value of material for a given purpose. Examples of verbs: judge, assess, compare, evaluate, conclude, measure, deduce, argue, decide, choose, rate, select, estimate, validate, consider, appraise, value, criticize, infer 6. Creating: Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning, or producing. Creating requires users to put parts together in a new way or synthesize parts into a new and different form or product. This process is the most difficult mental function in the new taxonomy. Sample verbs – change, combine, compose, construct, create, invent, design, formulate, generate, produce, revise, reconstruct, rearrange, visualize, write, plan
  • 19.
  • 20. Table of Specification Columns: Learning Competency | Number of Days | Number of Items | Item Placement by Cognitive Level: Remembering/Understanding (60%, Easy); Applying (30%, Average); Analyzing/Evaluating/Creating (10%, Difficult). Basic Concepts of Fractions: 1 day, 5 items, placed 1-5 (Easy). Addition of Fractions: 1 day, 5 items, 6-10 (Easy). Subtraction of Fractions: 1 day, 5 items, 11-15 (Easy). Multiplication and Division of Fractions: 3 days, 15 items, 16-30 (Easy). Application/Problem Solving: 4 days, 20 items, 31-45 (Average) and 46-50 (Difficult). Total: 10 days, 50 items (30 Easy, 15 Average, 5 Difficult).
  • 21. How to Determine the No. of Items? Formula: No. of items = (number of days x desired total number of items) / total number of days Example: Learning Competency: Multiplication and Division of Fractions Number of days: 3 Desired no. of items: 50 Total no. of class sessions: 10 No. of items = (3 x 50) / 10 = 15
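The formula above can be applied to every competency at once; as a minimal sketch, the session counts below come from the deck's sample fractions table, and 50 is the deck's example total:

```python
# Item allocation per competency: items = days x desired_total / total_days.
# Session counts are taken from the sample table of specification;
# rounding is a practical choice for when the division is not exact.
def items_for(days, desired_total, total_days):
    return round(days * desired_total / total_days)

sessions = {
    "Basic Concepts of Fractions": 1,
    "Addition of Fractions": 1,
    "Subtraction of Fractions": 1,
    "Multiplication and Division of Fractions": 3,
    "Application/Problem Solving": 4,
}
total_days = sum(sessions.values())  # 10 class sessions in the example
for topic, days in sessions.items():
    print(f"{topic}: {items_for(days, 50, total_days)} items")
```

This reproduces the table's allocation (5, 5, 5, 15, and 20 items), and the same loop answers the "Check your understanding" exercise on the next slide once its session counts are substituted in.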
  • 22. Check your understanding: Directions: Complete the table by supplying the no. of items for each learning competency. Learning Competency No. of Class Sessions No. of Items Musculo-Skeletal System 2 Integumentary System 2 Digestive System 3 Respiratory System 3 Circulatory System 4 Total 14
  • 23. Workshop – Making Table of Specification Using your Curriculum Guide, make a table of specification for Periodic Test per subject area in each quarter.
  • 24.
  • 26. 1. Identify the different rules in constructing multiple choice test 2. Construct multiple choice test Objectives:
  • 27.
  • 28. Research indicates . . . • Teachers tend to use tests that they have prepared themselves much more often than any other type of test. (How Teaching Matters, National Council for Accreditation of Teacher Education, Oct. 2000) • While assessment options are diverse, most classroom educators rely on text and curriculum-embedded questions, and tests that are overwhelmingly classified as paper-and-pencil (National Commission on Teaching and America’s Future, 1996)
  • 29. Research indicates . . . • Formal training in paper-and-pencil test construction may occur at the preservice level (52% of the time) or as in-service preparation (21%). A significant number of professional educators (48%) report no formal training in developing, administering, scoring, and interpreting tests (Education Week, “National Survey of Public School Teachers, 2000). • Students report a higher level of test anxiety over teacher-made tests (64%) than over standardized tests (30%). The top three reasons why: poor test construction, irrelevant or obscure material coverage, and unclear directions. (NCATE, “Summary Data on Teacher Effectiveness, Teacher Quality, and Teacher Qualifications”, 2001)
  • 30. Two General Categories of Test Items 1. Objective items which require students to select the correct response from several alternatives or to supply a word or short phrase to answer a question or complete a statement. Objective items include: multiple choice, true-false, matching, completion 2. Subjective or essay items which permit the student to organize and present an original answer. Subjective items include: short-answer essay, extended-response essay, problem solving, performance test items
  • 31. Creating a test is one of the most challenging tasks confronting a teacher. Unfortunately, many of us have had little, if any, preparation in writing tests.
  • 32. What makes a test good or bad? The most basic and obvious answer to that question is that good tests measure what you want to measure, and bad tests do not.
  • 33. When to use objective tests? Objective tests are appropriate when: The group to be tested is large and the test may be reused. Highly reliable scores must be obtained as efficiently as possible. Impartiality of evaluation, fairness, and freedom from possible test scoring influences are essential.
  • 34. When to use objective tests? Objective tests can be used to: Measure almost any important educational achievement a written test can measure. Test understanding and ability to apply principles. Test ability to think critically. Test ability to solve problems
  • 35. The matching of learning objective expectations with certain item types provides a high degree of test validity: testing what is supposed to be tested.
  • 36. Matching Learning Objectives with Test Items Directions: Below are four test item categories labeled A, B, C, and D. Match the learning objectives with the most appropriate test item category. A – Objective test Item (MC, true-false, matching) B – Performance Test Item C – Essay Test Item (extended response) D - Essay Test Item (short answer) 1. Name the parts of the human skeleton. Answer: A 2. Appraise a composition on the basis of its organization. Answer: C 3. Demonstrate safe laboratory skills. Answer: B
  • 37. Matching Learning Objectives with Test Items Directions: Below are four test item categories labeled A, B, C, and D. Match the learning objectives with the most appropriate test item category. A – Objective test Item (MC, true-false, matching) B – Performance Test Item C – Essay Test Item (extended response) D - Essay Test Item (short answer) 4. Cite four examples of satire that Twain uses in Huckleberry Finn. Answer: D 5. Design a logo for a web page. Answer: B 6. Describe the impact of a bull market. Answer: C 7. Diagnose a physical ailment. Answer: B
  • 38. Matching Learning Objectives with Test Items Directions: Below are four test item categories labeled A, B, C, and D. Match the learning objectives with the most appropriate test item category. A – Objective test Item (MC, true-false, matching) B – Performance Test Item C – Essay Test Item (extended response) D - Essay Test Item (short answer) 8. List important mental attributes necessary for an athlete. Answer: D 9. Categorize great American fiction writers. Answer: A 10. Analyze the major causes of learning disabilities. Answer: C
  • 39. In general, test items should . . . •Assess achievement of instructional objectives •Measure important aspects of the subject (concepts and conceptual relations) •Accurately reflect the emphasis placed on important aspects of instruction •Measure an appropriate level of student knowledge •Vary in levels of difficulty
  • 40. Technical Quality of a Test 1. Cognitive Complexity The test questions will focus on appropriate intellectual activity ranging from simple recall of facts to problem solving, critical thinking, and reasoning. 2. Content Quality The test questions will permit students to demonstrate their knowledge of challenging and important subject matter. 3. Meaningfulness The test questions will be worth students’ time and students will recognize and understand their value.
  • 41. Technical Quality of a Test 4. Language Appropriateness The language demands will be clear and appropriate to the assessment tasks and to students. 5. Transfer and Generalizability Successful performance on the test will allow valid generalizations about achievement to be made.
  • 42. Technical Quality of a Test 6. Fairness Student performance will be measured in a way that does not give advantage to factors irrelevant to school learning; scoring schemes will be similarly equitable. 7. Reliability Answers to test questions can be consistently trusted to represent what students know.
  • 43.
  • 44. Activity: Piliin Mo Ako! Directions: Choose the letter of the best answer.
  • 45. Question 1: Multiple choice items provide highly reliable test scores because: A. They do not place a high degree of dependence on the student’s reading ability B. They place a high degree of dependence on a teacher’s writing ability C. They are a subjective measurement of student achievement D. They allow a wide sampling of content and a reduced guessing factor Answer: D
  • 46. Question 2: You should: A. Always decide on an answer before reading the alternatives B. Always review your marked exams C. Never change an answer D. Always do the multiple choice items on an exam first Answer: B
  • 47. Question 3: The multiple choice item on the right is structurally undesirable because: A. A direct question is more desirable than an incomplete statement B. There is no explicit problem of information in the stem C. The alternatives are not all plausible D. All of the above E. A & B only F. B & C only G. A & C only H. None of the above Answer: D You should: A. Always decide on an answer before reading the alternatives B. Always review your marked exams C. Never change an answer D. Always do the multiple choice items on an exam first
  • 48. Question 4: The Question 3 multiple choice item on the right is undesirable because: A. It relies on an answer required in a previous item B. The stem does not supply enough information C. Eight alternatives are too many and too confusing to the students D. More alternatives just encourage guessing Answer: C Question 3: The multiple choice item on the right is structurally undesirable because: A. A direct question is more desirable than an incomplete statement B. There is no explicit problem of information in the stem C. The alternatives are not all plausible D. All of the above E. A & B only F. B & C only G. A & C only H. None of the above
  • 49. Question 5 The right answers in multiple choice questions tend to be: A. Longer and more descriptive B. The same length as the wrong answer C. At least a paragraph long D. Short Answer: A
  • 50. Question 6 When guessing on a multiple choice question with numbers in the answer A. Always pick the most extreme B. Pick the lowest range C. Pick answers in the middle range D. Always pick C Answer: C
  • 51. Question 7: What is the process of elimination in a multiple choice question? A. Skipping the entire question B. Eliminating all answers with extreme modifiers C. Just guessing D. Eliminating the wrong answers Answer: D
  • 52. Question 8 It is unlikely that a student who is unskilled in untangling negative statements will: A. Quickly understand multiple choice items not written in this way B. Not quickly understand multiple choice items not written in this way C. Quickly understand multiple choice items written in this way D. Not quickly understand multiple choice items written in this way Answer: C
  • 53. Multiple Choice Test Items An MC item consists of a stem, which presents the question or problem, and the response alternatives or choices. Usually, students are asked to select the one alternative that best completes a statement or answers a question. Item Stem: Which of the following is a chemical change? Response Alternatives: A. Evaporation of alcohol B. Freezing of water C. Burning of oil D. Melting of wax
  • 54. General Guidelines in Constructing MC Test 1. Make test items practical, with real-world applications for the students. 2. Use diagrams or drawings when asking questions about application, analysis or evaluation. 3. When asking students to interpret or evaluate quotations, present actual quotations from secondary sources such as published books or newspapers. 4. Use tables, figures, or charts when asking questions that require interpretation. 5. Use pictures if possible when students are required to apply concepts and principles.
  • 55. General Guidelines in Constructing MC Test 6. List the choices/options vertically not horizontally. 7. Avoid trivial questions. 8. Use only one correct answer or best answer format. 9. Use three to five options to discourage guessing. 10. Be sure that distracters are plausible and effective. 11. Increase the similarity of the options to increase the difficulty of the item. 12. Do not use “none of the above” options when asking for a best answer. 13. Avoid using “all of the above” options. It is usually the correct answer and makes the item too easy for the examinee with partial knowledge.
  • 56. Guidelines in Constructing the Stem 1. The stem should be written in question form or completion form. Research suggests that the question form is preferable. 2. Do not place the blank at the beginning or in the middle of the stem when using the completion form of a multiple-choice test. 3. The stem should pose the problem completely. 4. The stem should be clear and concise. 5. Avoid excessive and meaningless words in the stem. 6. State the stem in positive form. Avoid negative words like “not” or “except”; if a negative word cannot be avoided, underline or capitalize it. Example: Which of the following does not belong to the group? should be written as Which of the following does NOT belong to the group? 7. Avoid grammatical clues to the correct answer.
  • 57. Guidelines in Constructing Options 1. There should be one correct or best answer in each item. 2. List options vertically, not horizontally, beneath the stem. 3. Arrange the options in logical order and use capital letters such as A, B, C, D, E to indicate each option. 4. Options should not overlap; keep them independent of one another. 5. All options must be homogeneous in content to increase the difficulty of the item. 6. As much as possible, keep the options equal in length. 7. Avoid using the phrase “all of the above.” 8. Avoid using the phrase “none of the above” or “I don’t know.”
  • 58. Guidelines in Constructing the Distracters 1. The distracters should be plausible. 2. The distracters should be equally attractive to all examinees. 3. Avoid ineffective distracters; replace any distracter that does not attract examinees. 4. Each distracter should be chosen by at least 5% of the examinees, but by fewer examinees than choose the key answer. 5. Revise distracters that are overly attractive; the item might be ambiguous to the examinees.
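The 5% rule and the overly-attractive-distracter check above can be applied to actual response data with a short script. This is a minimal sketch, not part of the original material: the function name, option labels, and sample responses are illustrative assumptions.

```python
from collections import Counter

def distracter_report(responses, key, options="ABCD", min_share=0.05):
    """Share of examinees choosing each option, with flags for weak distracters."""
    total = len(responses)
    counts = Counter(responses)
    report = {}
    for option in options:
        share = counts[option] / total
        flag = "key" if option == key else ""
        if option != key and share < min_share:
            # Guideline 4: a distracter chosen by under 5% is ineffective
            flag = "ineffective (<5%): replace or revise"
        if option != key and counts[option] > counts[key]:
            # Guideline 5: more popular than the key suggests an ambiguous item
            flag = "overly attractive: item may be ambiguous"
        report[option] = (share, flag)
    return report

# 20 examinees answering one item whose key is C:
# 11 chose C, 5 chose A, 4 chose B, none chose D
answers = list("CCCCCCCCCCCAAAAABBBB")
report = distracter_report(answers, key="C")
for option, (share, flag) in report.items():
    print(option, f"{share:.0%}", flag)
```

In this sample, D attracts no examinees and would be flagged for replacement, while A and B pass both checks.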
  • 59. Advantages of MC Test 1. Measures learning outcomes from the knowledge level up to the evaluation level. 2. Scoring is highly objective, easy and reliable. 3. Scores are more reliable than those of subjective types of test. 4. Measures a broad sample of content within a short span of time. 5. Distracters can provide diagnostic information. 6. Item analysis can reveal the difficulty of an item and can discriminate between high-performing and low-performing students.
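The item analysis mentioned in point 6 is commonly computed as a difficulty index (the proportion of examinees answering the item correctly) and a discrimination index (the proportion correct in the upper group minus that in the lower group). A minimal sketch, assuming examinees are already ranked by total test score and using the common 27% upper/lower-group convention; the function name and sample marks are illustrative:

```python
def item_analysis(marks, group_frac=0.27):
    """Difficulty index p and discrimination index D for one item.

    marks: list of 1/0 scores on the item, one per examinee, ordered
           from the highest total test score to the lowest.
    """
    n = len(marks)
    k = max(1, round(n * group_frac))          # size of upper/lower groups
    upper, lower = marks[:k], marks[-k:]
    p = sum(marks) / n                         # difficulty: proportion correct
    d = (sum(upper) - sum(lower)) / k          # discrimination: upper minus lower
    return p, d

# 10 examinees already ranked by total score; 1 = answered this item correctly
marks = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]
p, d = item_analysis(marks)
print(f"difficulty p = {p:.2f}, discrimination D = {d:.2f}")
```

A positive D means the item favors the stronger examinees, which is what a good item should do; an item with D near zero or negative is a candidate for revision.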
  • 60. Disadvantages of MC Test 1. Time consuming to construct a good item. 2. Difficult to find effective and plausible distracters. 3. Scores can be influenced by the reading ability of the examinees. 4. In some cases, there is more than one justifiable correct answer. 5. Ineffective in assessing the problem-solving skills of the students. 6. Not applicable when assessing the students’ ability to organize and express ideas.
  • 61. Activity: Improve Mo Ako! (“Improve Me!”) Directions: The following multiple choice questions are poorly constructed. Write a better version of each question.
  • 68. Item 7 Poor Item Better Item
  • 69. Item 8 Poor Item Better Item
  • 70. Item 9 Poor Item Better Item
  • 71. Item 10 Poor Item Better Item
  • 72. “Understand that there is always one clearly best answer. Your goal is not to trick students or require them to make difficult judgments about two options that are nearly equally correct. Your goal is to design questions that students who understand will answer correctly and students who do not understand will answer incorrectly.” John A. Johnson Dept. of Psychology Penn State University
  • 73. POINTS TO PONDER. . . A good lesson makes a good question A good question makes a good content A good content makes a good test A good test makes a good grade A good grade makes a good student A good student makes a good COMMUNITY Jesus Ochave, Ph.D. VP for Research, Planning and Development Philippine Normal University