Test Construction
How to Write Effective Test
Questions
In-Service Training for Teachers
April 13-14, 2020
SDO Lucena
Objectives:
1. Discuss the steps in developing a table of specification
2. Construct a table of specification
TABLE OF SPECIFICATION
Activity: Arrange the following steps in preparing the table of specification used by the test constructor.
• Make a two-way chart (table of specification)
• Make an outline of the subject matter to be covered in the test
• Construct the test items
• Select the learning outcomes to be measured
• Decide on the number of items per subtopic
Philippine Professional Standards for Teachers (PPST)
Domain 5: Assessment and Reporting
Domain 5 relates to processes associated with a variety of
assessment tools and strategies used by teachers in
monitoring, evaluating, documenting and reporting learners’
needs, progress and achievement. This Domain concerns the
use of assessment data in a variety of ways to inform and
enhance the teaching and learning process and programs. It
concerns teachers providing learners with the necessary
feedback about learning outcomes. This feedback informs the
reporting cycle and enables teachers to select, organize and
use sound assessment processes.
Domain 5, Assessment and Reporting, is composed of five
strands:
1. Design, selection, organization and utilization of assessment
strategies
2. Monitoring and evaluation of learner progress and
achievement
3. Feedback to improve learning
4. Communication of learner needs, progress and achievement
to key stakeholders
5. Use of assessment data to enhance teaching and learning
practices and programs
Domain 5, Assessment and Reporting, is composed of five
strands:
1.Design, selection,
organization and utilization
of assessment strategies
Table of Specification
• A chart or table that details the content and the level of cognitive
processes assessed on a test, as well as the types and emphases of test items
• Very important in addressing the validity and reliability of the test
items
• Provides the test constructor a way to ensure that the assessment is
based on the intended learning outcomes
• A way of ensuring that the number of questions on the test is
adequate to ensure dependable results that are not likely caused by
chance
• A useful guide in constructing a test and in determining the type of
test items that you need to construct
DO 79, S. 2003 – ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF
STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS
AMENDED BY DO 82, S. 2003 – AMENDMENT OF DEPED ORDER NO. 79 S. 2003 ON
ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’
PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS
1. This Department, responding to the need for an assessment and evaluation system that truly
reflects student performance, issues the following guidelines in the assessment and reporting of
students’ progress:
1.1 Grades shall not be computed on the basis of any transmutation table that equates zero to a
pre-selected base (such as 50 or 70) and adjusts other scores accordingly.
1.2 Grades shall be based on assessment that covers the range of learning competencies
specified in the Philippine Elementary Learning Competencies (PELC) and Philippine Secondary
Schools Learning Competencies (PSSLC). The test shall be designed as follows:
- 60% easy items focused on basic content and skills expected of a student in
each grade or year level;
- 30% medium-level items focused on higher-level skills; and
- 10% difficult items focused on desirable content or skills that aim to distinguish the fast learners.
DO 33, s 2004 - Implementing Guidelines on the
Performance-Based Grading System for SY 2004-2005
3. In assessing learning outcomes, the construction of the test
design should consist of 60% basic items, 30% more advanced
items and 10% items for distinguishing honor students.
Questions in each category should have different weights. Test
and non-test items should cover only materials actually taken up
in class.
Factual information (easy) – 60%
Moderately difficult (average) – 30%
Higher order thinking skills (difficult) – 10%
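For a quick check of how the 60-30-10 distribution translates into item counts for a given test length, here is a minimal Python sketch; the function name and the 50-item example are illustrative and not taken from the DepEd orders.

```python
# Minimal sketch: split a test's total item count into the 60-30-10
# difficulty bands described above. Function and variable names are
# illustrative only, not taken from the DepEd orders.

def difficulty_split(total_items):
    shares = [("easy", 0.60), ("average", 0.30), ("difficult", 0.10)]
    counts = {}
    allocated = 0
    for name, share in shares[:-1]:
        counts[name] = round(total_items * share)
        allocated += counts[name]
    # Give the last band the remainder so the counts always sum to the total.
    counts[shares[-1][0]] = total_items - allocated
    return counts

print(difficulty_split(50))  # {'easy': 30, 'average': 15, 'difficult': 5}
```

For a 50-item test this gives the 30/15/5 split used in the sample table of specification later in this deck.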
[Slide reference: RPMS Manual, page 142 (Assessment, Objective, MOV)]
BLOOM’S REVISED TAXONOMY (Anderson and Krathwohl): 1956 vs. 2001
Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
1. Knowledge: Remembering or
retrieving previously learned
material.
Examples of verbs that relate to this
function are: identify, relate, list,
define, recall, memorize, repeat,
record, name, recognize, acquire
1. Remembering: Objectives written
on the remembering level –
retrieving, recalling, or recognizing
knowledge from memory.
Remembering is when memory is
used to produce definitions, facts,
or lists; to recite or retrieve
material.
Sample verbs: state, tell, underline,
locate, match, spell, fill in the
blank, identify, relate, list, define,
recall, memorize, repeat, record,
name, recognize, acquire
Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
2. Comprehension: The ability to
grasp or construct meaning from
material.
Examples of verbs: restate, locate,
report, recognize, explain, express,
identify, discuss, describe, review,
infer, conclude, illustrate, interpret,
draw, represent, differentiate
2. Understanding: Constructing
meaning from different types of
functions, whether written or graphic,
through activities such as interpreting,
exemplifying, classifying,
summarizing, inferring, comparing
and explaining.
Sample verbs: restate, locate, report,
recognize, explain, express, identify,
discuss, describe, review, infer,
conclude, illustrate, interpret, draw,
represent, differentiate
Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
3. Application: The ability to use
learned material, or to implement
material in new and concrete
situations.
Examples of verbs: apply, relate,
develop, translate, use, operate,
organize, employ, restructure,
interpret, demonstrate, illustrate,
practice, calculate, show, exhibit,
dramatize
3. Applying: Carrying out or using a
procedure through executing or
implementing. Applying refers to
situations where learned
material is used through products
like models, presentations,
interviews or simulations.
Sample verbs: apply, relate, develop,
translate, use, operate, organize,
employ, restructure, interpret,
demonstrate, illustrate, practice,
calculate, show, exhibit, dramatize
Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
4. Analysis: The ability to break down
or distinguish the parts of the
material into their components so
that their organizational structure
may be better understood.
Examples of verbs: analyze, compare,
probe, inquire, examine, contrast,
categorize, differentiate, investigate,
detect, survey, classify, deduce,
experiment, scrutinize, discover,
inspect, dissect, discriminate,
separate
4. Analyzing: Breaking material or
concepts into parts, determining how
the parts relate or interrelate to one
another or to an overall structure or
purpose. Mental actions included in
this function are differentiating,
organizing and attributing, as well as
being able to distinguish between
the components or parts. When
analyzing, one can illustrate this
mental function by creating
spreadsheets, surveys, charts,
diagrams, or other graphic representations.
Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
5. Synthesis: The ability to put parts
together to form a coherent or
unique new whole.
Examples of verbs: produce, design,
assemble, create, prepare, predict,
modify, plan, invent, formulate,
collect, set up, generalize, document,
combine, propose, develop, arrange,
construct, organize, originate, derive,
write
5. Evaluating: Making judgments
based on criteria and standards
through checking and critiquing.
Critiques, recommendations, and
reports are some of the products
that can be created to demonstrate
the processes of evaluation.
Sample verbs: appraise, choose,
compare, conclude, decide, defend,
evaluate, give your opinion, judge,
justify, prioritize, rank, rate, select,
support, value
Bloom’s Taxonomy (1956) vs. Anderson/Krathwohl’s Revision (2001)
6. Evaluation: The ability to judge,
check, and even critique the value of
material for a given purpose.
Examples of verbs: judge, assess,
compare, evaluate, conclude,
measure, deduce, argue, decide,
choose, rate, select, estimate,
validate, consider, appraise, value,
criticize, infer
6. Creating: Putting elements together to
form a coherent or functional whole;
reorganizing elements into a new pattern
or structure through generating, planning,
or producing. Creating requires users to put
parts together in a new way or synthesize
parts into a new and different
form or product. This process is the most
difficult mental function in the new
taxonomy. Sample verbs – change,
combine, compose, construct, create,
invent, design, formulate, generate,
produce, revise, reconstruct, rearrange,
visualize, write, plan
Table of Specification
(Item placement is shown under the corresponding cognitive level; percentages indicate the difficulty distribution.)

Learning Competency | Number of Days | Number of Items | Remembering/Understanding (60%, Easy) | Applying (30%, Average) | Analyzing/Evaluating/Creating (10%, Difficult)
Basic Concepts of Fractions | 1 | 5 | 1-5 | |
Addition of Fractions | 1 | 5 | 6-10 | |
Subtraction of Fractions | 1 | 5 | 11-15 | |
Multiplication and Division of Fractions | 3 | 15 | 16-30 | 31-40 |
Application/Problem Solving | 4 | 20 | | 41-45 | 46-50
Total | 10 | 50 | 30 | 15 | 5
How to Determine the No. of Items?
Formula:
No. of items = (number of days × desired total number of items) ÷ (total number of days)
Example:
Learning Competency: Multiplication and Division of Fractions
Number of days: 3
Desired no. of items: 50
Total no. of class sessions: 10
No. of items = (3 × 50) ÷ 10 = 15
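To make the computation easy to repeat for every competency, here is a minimal Python sketch of the slide's formula; the function and variable names are illustrative, not from the training material.

```python
# Minimal sketch of the slide's formula:
#   no. of items = (number of days x desired total items) / total number of days
# Names are illustrative; rounding only matters when the division does
# not come out even (the worked example below divides exactly).

def items_for_competency(days, desired_total_items, total_days):
    return days * desired_total_items / total_days

# Worked example from the slide: Multiplication and Division of Fractions
print(items_for_competency(3, 50, 10))   # 15.0
```

Running it for the other competencies in the sample table (1 day gives 5 items, 4 days gives 20 items) reproduces the Number of Items column.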
Check your understanding:
Directions: Complete the table by supplying the no. of items for each
learning competency.

Learning Competency | No. of Class Sessions | No. of Items
Musculo-Skeletal System | 2 |
Integumentary System | 2 |
Digestive System | 3 |
Respiratory System | 3 |
Circulatory System | 4 |
Total | 14 |
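The activity does not state the desired total number of items, so any worked answer needs an assumption. The sketch below assumes, purely for illustration, a 50-item test and shows one possible way to round the raw shares so the counts still sum to the total; the rounding scheme is a design choice, not part of the slide.

```python
# Applying the number-of-items formula to the activity, ASSUMING a
# 50-item test (the slide does not state the desired total).
# Raw shares are rounded down, then leftover items go to the
# competencies with the largest remainders so the counts sum to 50.

import math

sessions = {
    "Musculo-Skeletal System": 2,
    "Integumentary System": 2,
    "Digestive System": 3,
    "Respiratory System": 3,
    "Circulatory System": 4,
}
desired_total = 50                        # assumed, for illustration only
total_sessions = sum(sessions.values())   # 14

raw = {k: v * desired_total / total_sessions for k, v in sessions.items()}
counts = {k: math.floor(x) for k, x in raw.items()}
leftover = desired_total - sum(counts.values())
for k in sorted(raw, key=lambda k: raw[k] - counts[k], reverse=True)[:leftover]:
    counts[k] += 1

print(counts)
# {'Musculo-Skeletal System': 7, 'Integumentary System': 7,
#  'Digestive System': 11, 'Respiratory System': 11, 'Circulatory System': 14}
```

With a different assumed total, the same proportional allocation applies.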
Workshop – Making a Table of Specification
Using your Curriculum Guide, make a
table of specification for the Periodic Test
per subject area in each quarter.
Test Construction
Objectives:
1. Identify the different rules in constructing a multiple-choice test
2. Construct a multiple-choice test
Research indicates . . .
• Teachers tend to use tests that they have prepared
themselves much more often than any other type of
test. (How Teaching Matters, National Council for
Accreditation of Teacher Education, Oct. 2000)
• While assessment options are diverse, most classroom
educators rely on text and curriculum-embedded
questions, and tests that are overwhelmingly classified
as paper-and-pencil (National Commission on Teaching
and America’s Future, 1996)
Research indicates . . .
• Formal training in paper-and-pencil test construction may occur
at the preservice level (52% of the time) or as in-service
preparation (21%). A significant number of professional
educators (48%) report no formal training in developing,
administering, scoring, and interpreting tests (Education Week,
“National Survey of Public School Teachers,” 2000).
• Students report a higher level of test anxiety over teacher-made
tests (64%) than over standardized tests (30%). The top three
reasons why: poor test construction, irrelevant or
obscure material coverage, and unclear directions.
(NCATE, “Summary Data on Teacher Effectiveness, Teacher
Quality, and Teacher Qualifications”, 2001)
Two General Categories of Test Items
1. Objective items which require students to select the
correct response from several alternatives or to supply
a word or short phrase to answer a question or
complete a statement. Objective items include:
multiple choice, true-false, matching, completion
2. Subjective or essay items which permit the student to
organize and present an original answer. Subjective
items include: short-answer essay, extended-response
essay, problem solving, performance test items
Creating a test is one of the most
challenging tasks confronting a
teacher.
Unfortunately, many of
us have had little, if any,
preparation in writing
tests.
What makes a test good or bad?
The most basic and
obvious answer to that
question is that good tests
measure what you want to
measure, and bad tests do
not.
When to use objective tests?
Objective tests are appropriate when:
The group to be tested is large and the test may
be reused.
Highly reliable scores must be obtained as
efficiently as possible.
Impartiality of evaluation, fairness, and
freedom from possible test scoring influences
are essential.
When to use objective tests?
Objective tests can be used to:
Measure almost any important educational
achievement a written test can measure.
Test understanding and ability to apply
principles.
Test ability to think critically.
Test ability to solve problems
The matching of
learning objective
expectations with
certain item types
provides a high
degree of test
validity: testing what
is supposed to be
tested.
Matching Learning Objectives with Test Items
Directions: Below are four test item categories labeled A, B, C, and D. Match each
learning objective with the most appropriate test item category.
A – Objective Test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D – Essay Test Item (short answer)
1. Name the parts of the human skeleton.
Answer: A
2. Appraise a composition on the basis of its organization.
Answer: C
3. Demonstrate safe laboratory skills.
Answer: B
Matching Learning Objectives with Test Items
Directions: Below are four test item categories labeled A, B, C, and D. Match each
learning objective with the most appropriate test item category.
A – Objective Test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D – Essay Test Item (short answer)
4. Cite four examples of satire that Twain uses in Huckleberry Finn.
Answer: D
5. Design a logo for a web page.
Answer: B
6. Describe the impact of a bull market.
Answer: C
7. Diagnose a physical ailment.
Answer: B
Matching Learning Objectives with Test Items
Directions: Below are four test item categories labeled A, B, C, and D. Match each
learning objective with the most appropriate test item category.
A – Objective Test Item (MC, true-false, matching)
B – Performance Test Item
C – Essay Test Item (extended response)
D – Essay Test Item (short answer)
8. List important mental attributes necessary for an athlete.
Answer: D
9. Categorize great American fiction writers.
Answer: A
10. Analyze the major causes of learning disabilities.
Answer: C
In general, test items should . . .
•Assess achievement of instructional objectives
•Measure important aspects of the subject
(concepts and conceptual relations)
•Accurately reflect the emphasis placed on
important aspects of instruction
•Measure an appropriate level of student
knowledge
•Vary in levels of difficulty
Technical Quality of a Test
1. Cognitive Complexity
The test questions will focus on appropriate intellectual
activity ranging from simple recall of facts to problem
solving, critical thinking, and reasoning.
2. Content Quality
The test questions will permit students to demonstrate their
knowledge of challenging and important subject matter.
3. Meaningfulness
The test questions will be worth students’ time and students
will recognize and understand their value.
Technical Quality of a Test
4. Language Appropriateness
The language demands will be clear and
appropriate to the assessment tasks and to
students.
5. Transfer and Generalizability
Successful performance on the test will allow
valid generalizations about achievement to be
made.
Technical Quality of a Test
6. Fairness
Student performance will be measured in a way
that does not give advantage to factors
irrelevant to school learning; scoring schemes
will be similarly equitable.
7. Reliability
Answers to test questions will be consistently
trusted to represent what students know.
Activity: Piliin Mo Ako!
Directions: Choose the letter of the best
answer.
Question 1:
Multiple choice items provide highly reliable test scores because:
A. They do not place a high degree of dependence on the students’
reading ability
B. They place a high degree of dependence on a teacher’s writing ability
C. They are a subjective measurement of student achievement
D. They allow a wide sampling of content and a reduced guessing factor
Answer: D
Question 2:
You should:
A. Always decide on an answer before reading the alternatives
B. Always review your marked exams
C. Never change an answer
D. Always do the multiple choice items on an exam first
Answer: B
Question 3:
The multiple choice item on the right is
structurally undesirable because:
A. A direct question is more desirable than
an incomplete statement
B. There is no explicit problem or
information in the stem
C. The alternatives are not all plausible
D. All of the above
E. A & B only
F. B & C only
G. A & C only
H. None of the above
Answer: D
You should:
A. Always decide on an answer
before reading the
alternatives
B. Always review your marked
exams
C. Never change an answer
D. Always do the multiple
choice items on an exam first
Question 4:
The Question 3 multiple choice item on the
right is undesirable because:
A. It relies on an answer required in a
previous item
B. The stem does not supply enough
information
C. Eight alternatives are too many and
too confusing to the students
D. More alternatives just encourage
guessing
Answer: C
Question 3:
The multiple choice item on the
right is structurally undesirable
because:
A. A direct question is more
desirable than an incomplete
statement
B. There is no explicit problem or
information in the stem
C. The alternatives are not all
plausible
D. All of the above
E. A & B only
F. B & C only
G. A & C only
H. None of the above
Question 5
The right answers in multiple choice questions tend to be:
A. Longer and more descriptive
B. The same length as the wrong answer
C. At least a paragraph long
D. Short
Answer: A
Question 6
When guessing on a multiple choice question with numbers in the
answer
A. Always pick the most extreme
B. Pick the lowest range
C. Pick answers in the middle range
D. Always pick C
Answer: C
Question 7:
What is the process of elimination in a multiple choice question?
A. Skipping the entire question
B. Eliminating all answers with extreme modifiers
C. Just guessing
D. Eliminating the wrong answers
Answer: D
Question 8
It is unlikely that a student who is unskilled in untangling negative
statements will:
A. Quickly understand multiple choice items not written in this way
B. Not quickly understand multiple choice items not written in this
way
C. Quickly understand multiple choice items written in this way
D. Not quickly understand multiple choice items written in this way
Answer: C
Multiple Choice Test Items
An MC item consists of a stem, which identifies the question or problem,
and the response alternatives or choices. Usually, students are asked to
select the one alternative that best completes a statement or answers a
question.
Item Stem: Which of the following is a chemical change?
Response Alternatives: A. Evaporation of alcohol
B. Freezing of water
C. Burning of oil
D. Melting of wax
General Guidelines in Constructing MC Test
1. Make test items practical, with real-world
applications for the students.
2. Use diagrams or drawings when asking questions about
application, analysis or evaluation.
3. When asking students to interpret or evaluate quotations,
present actual quotations from secondary sources such as
published books or newspapers.
4. Use tables, figures, or charts when asking questions that
require interpretation.
5. Use pictures if possible when students are required to
apply concepts and principles.
General Guidelines in Constructing MC Test
6. List the choices/options vertically not horizontally.
7. Avoid trivial questions.
8. Use only one correct answer or best answer format.
9. Use three to five options to discourage guessing.
10. Be sure that distracters are plausible and effective.
11. Increase the similarity of the options to increase the difficulty of
the item.
12. Do not use “none of the above” options when asking for a best
answer.
13. Avoid using “all of the above” options. It is usually the correct
answer and makes the item too easy for the examinee with partial
knowledge.
Guidelines in Constructing the Stem
1. The stem should be written in question form or completion form.
Research shows that the question form is generally more advisable.
2. When using the completion form of a multiple-choice item, do not place the
blank at the beginning or in the middle of the stem.
3. The stem should pose the problem completely.
4. The stem should be clear and concise.
5. Avoid excessive and meaningless use of words in the stem.
6. State the stem in positive form. Avoid negative words such as “not”
or “except.” If a negative word cannot be avoided, underline or capitalize it.
Example: write “Which of the following does NOT belong to the group?”
rather than “Which of the following does not belong to the group?”
7. Avoid grammatical clues in the correct answer.
Guidelines in Constructing Options
1. There should be one correct or best answer in each item.
2. List options in vertical order not horizontal order beneath the stem.
3. Arrange the options in logical order and use capital letters to
indicate each option such as A, B, C, D, E.
4. No overlapping options; keep them independent of one another.
5. All options must be homogenous in content to increase the
difficulty of an item.
6. As much as possible, the length of the options must be
the same.
7. Avoid using the phrase “all of the above.”
8. Avoid using the phrase “none of the above” or “I don’t know.”
Guidelines in Constructing the Distracters
1. The distracters should be plausible.
2. The distracters should be equally popular to all examinees.
3. Avoid using ineffective distracters. Replace distracter(s)
that are not effective to the examinees.
4. Each distracter should be chosen by at least 5% of the examinees,
but by fewer examinees than the keyed answer.
5. Revise distracter(s) that are overly attractive; they
might be ambiguous to the examinees.
Advantages of MC Test
1. Measures learning outcomes from the knowledge to
evaluation level.
2. Scoring is highly objective, easy and reliable.
3. Scores are more reliable than subjective type of test.
4. Measures broad samples of content within a short
span of time.
5. Distracters can provide diagnostic information.
6. Item analysis can reveal the difficulty of an item and
can discriminate between high- and low-performing students (see the sketch below).
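As a concrete illustration of that last point, the sketch below computes a classical difficulty index (the proportion of examinees answering an item correctly) and a simple upper-minus-lower discrimination index from scored responses; the data, function names, and group fraction are hypothetical and not part of the training material.

```python
# Minimal item-analysis sketch (illustrative, not from the slides).
# scores: one list per examinee of 1/0 item results.
# Difficulty index = proportion of examinees who got the item right.
# Discrimination   = difficulty in the upper 27% of examinees (by total
#                    score) minus difficulty in the lower 27%.

def item_analysis(scores, group_fraction=0.27):
    n_items = len(scores[0])
    ranked = sorted(scores, key=sum, reverse=True)
    k = max(1, int(len(ranked) * group_fraction))
    upper, lower = ranked[:k], ranked[-k:]
    results = []
    for i in range(n_items):
        difficulty = sum(s[i] for s in scores) / len(scores)
        discrimination = (sum(s[i] for s in upper) - sum(s[i] for s in lower)) / k
        results.append((difficulty, discrimination))
    return results

# Toy data: 5 examinees, 3 items (1 = correct, 0 = wrong)
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 0],
]
for i, (p, d) in enumerate(item_analysis(responses), start=1):
    print(f"Item {i}: difficulty={p:.2f}, discrimination={d:.2f}")
```

Items with a discrimination index near zero or negative are the usual candidates for revision or replacement.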
Disadvantages of MC Test
1. Time consuming to construct a good item.
2. Difficult to find effective and plausible distracters.
3. Scores can be influenced by the reading ability of the
examinees.
4. In some cases, there is more than one justifiable correct
answer.
5. Ineffective in assessing the problem-solving skills of
the students.
6. Not applicable when assessing the students’ ability to
organize and express ideas.
Activity: Improve Mo Ako!
Directions: The following multiple choice
questions are poorly constructed. Write a
better version of the question.
Items 1–10: each slide shows a poorly constructed multiple-choice item (“Poor Item”) alongside space for an improved version (“Better Item”).
“Understand that there is always one clearly best
answer. Your goal is not to trick students or require
them to make difficult judgments about two
options that are nearly equally correct. Your goal is
to design questions that students who understand
will answer correctly and students who do not
understand will answer incorrectly.”
John A. Johnson
Dept. of Psychology
Penn State University
POINTS TO PONDER. . .
A good lesson makes a good question
A good question makes a good content
A good content makes a good test
A good test makes a good grade
A good grade makes a good student
A good student makes a good COMMUNITY
Jesus Ochave Ph.D.
VP Research Planning and Development
Philippine Normal University
Test-Construction-B (1).pptx

More Related Content

What's hot

LAC IMPLEMENTATION PLAN SY 2022-23.docx
LAC IMPLEMENTATION PLAN SY 2022-23.docxLAC IMPLEMENTATION PLAN SY 2022-23.docx
LAC IMPLEMENTATION PLAN SY 2022-23.docxJenniferSimyunn3
 
RPMS-with-movs-and-annotations-for-printing.ppt
RPMS-with-movs-and-annotations-for-printing.pptRPMS-with-movs-and-annotations-for-printing.ppt
RPMS-with-movs-and-annotations-for-printing.pptMARIADELCORTEZ
 
Sciemath lac-plan-2021-2022
Sciemath lac-plan-2021-2022Sciemath lac-plan-2021-2022
Sciemath lac-plan-2021-2022TeodyGumabat
 
DepED Issuances on PPSSH
DepED Issuances on PPSSHDepED Issuances on PPSSH
DepED Issuances on PPSSHDivine Dizon
 
NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022
NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022
NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022Lindy Pujante
 
Preparation-and-Checking-of-School-Forms-SY2022-2023.pptx
Preparation-and-Checking-of-School-Forms-SY2022-2023.pptxPreparation-and-Checking-of-School-Forms-SY2022-2023.pptx
Preparation-and-Checking-of-School-Forms-SY2022-2023.pptxJoanaJallorina
 
RPMS IPCRF Memo 2023 DM_s2023_008.pdf
RPMS IPCRF Memo 2023 DM_s2023_008.pdfRPMS IPCRF Memo 2023 DM_s2023_008.pdf
RPMS IPCRF Memo 2023 DM_s2023_008.pdfGlenn Rivera
 
Philippine Professional Standards for Teachers
Philippine Professional Standards for TeachersPhilippine Professional Standards for Teachers
Philippine Professional Standards for TeachersJohn Adrian Adiaz
 
Introduction to SOLO taxonomy
Introduction to SOLO taxonomyIntroduction to SOLO taxonomy
Introduction to SOLO taxonomyDavid Didau
 
SMEA-FIRST-2022-2023.pptx
SMEA-FIRST-2022-2023.pptxSMEA-FIRST-2022-2023.pptx
SMEA-FIRST-2022-2023.pptxJoel Rodriguez
 
K to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentation
K to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentationK to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentation
K to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentationChuckry Maunes
 
Table of Specifications (TOS) and Test Construction Review
Table of Specifications (TOS) and Test Construction ReviewTable of Specifications (TOS) and Test Construction Review
Table of Specifications (TOS) and Test Construction ReviewRivera Arnel
 
DepEd School Governing Council (SGC) Orientation
DepEd School Governing Council (SGC) OrientationDepEd School Governing Council (SGC) Orientation
DepEd School Governing Council (SGC) OrientationSire Bryan Lancelot
 
Supervisory plan-and-report-2022
Supervisory plan-and-report-2022Supervisory plan-and-report-2022
Supervisory plan-and-report-2022ReyMarkVidalLacaden
 
Classroom Rules Orientation During the First Day of Class
Classroom Rules Orientation During the First Day of ClassClassroom Rules Orientation During the First Day of Class
Classroom Rules Orientation During the First Day of ClassSecondary School Teacher
 

What's hot (20)

Cot rpms-for-teacher-1-3
Cot rpms-for-teacher-1-3Cot rpms-for-teacher-1-3
Cot rpms-for-teacher-1-3
 
LAC IMPLEMENTATION PLAN SY 2022-23.docx
LAC IMPLEMENTATION PLAN SY 2022-23.docxLAC IMPLEMENTATION PLAN SY 2022-23.docx
LAC IMPLEMENTATION PLAN SY 2022-23.docx
 
RPMS-with-movs-and-annotations-for-printing.ppt
RPMS-with-movs-and-annotations-for-printing.pptRPMS-with-movs-and-annotations-for-printing.ppt
RPMS-with-movs-and-annotations-for-printing.ppt
 
Sciemath lac-plan-2021-2022
Sciemath lac-plan-2021-2022Sciemath lac-plan-2021-2022
Sciemath lac-plan-2021-2022
 
LAC PLAN_2022-2023.docx
LAC PLAN_2022-2023.docxLAC PLAN_2022-2023.docx
LAC PLAN_2022-2023.docx
 
DepED Issuances on PPSSH
DepED Issuances on PPSSHDepED Issuances on PPSSH
DepED Issuances on PPSSH
 
NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022
NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022
NAALAD ELEM SCHOOL IMPROVEMENT PLAN 2019-2022
 
Electronic Self-assessment Tool (e-SAT)
Electronic Self-assessmentTool (e-SAT)Electronic Self-assessmentTool (e-SAT)
Electronic Self-assessment Tool (e-SAT)
 
Test Construction
Test ConstructionTest Construction
Test Construction
 
Preparation-and-Checking-of-School-Forms-SY2022-2023.pptx
Preparation-and-Checking-of-School-Forms-SY2022-2023.pptxPreparation-and-Checking-of-School-Forms-SY2022-2023.pptx
Preparation-and-Checking-of-School-Forms-SY2022-2023.pptx
 
RPMS IPCRF Memo 2023 DM_s2023_008.pdf
RPMS IPCRF Memo 2023 DM_s2023_008.pdfRPMS IPCRF Memo 2023 DM_s2023_008.pdf
RPMS IPCRF Memo 2023 DM_s2023_008.pdf
 
Philippine Professional Standards for Teachers
Philippine Professional Standards for TeachersPhilippine Professional Standards for Teachers
Philippine Professional Standards for Teachers
 
Introduction to SOLO taxonomy
Introduction to SOLO taxonomyIntroduction to SOLO taxonomy
Introduction to SOLO taxonomy
 
SMEA-FIRST-2022-2023.pptx
SMEA-FIRST-2022-2023.pptxSMEA-FIRST-2022-2023.pptx
SMEA-FIRST-2022-2023.pptx
 
K to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentation
K to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentationK to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentation
K to 12 Grading Sheet Deped Order No. 8 S. 2015 PPT presentation
 
Table of Specifications (TOS) and Test Construction Review
Table of Specifications (TOS) and Test Construction ReviewTable of Specifications (TOS) and Test Construction Review
Table of Specifications (TOS) and Test Construction Review
 
DepEd School Governing Council (SGC) Orientation
DepEd School Governing Council (SGC) OrientationDepEd School Governing Council (SGC) Orientation
DepEd School Governing Council (SGC) Orientation
 
Supervisory plan-and-report-2022
Supervisory plan-and-report-2022Supervisory plan-and-report-2022
Supervisory plan-and-report-2022
 
RPMS2022-2023.pdf
RPMS2022-2023.pdfRPMS2022-2023.pdf
RPMS2022-2023.pdf
 
Classroom Rules Orientation During the First Day of Class
Classroom Rules Orientation During the First Day of ClassClassroom Rules Orientation During the First Day of Class
Classroom Rules Orientation During the First Day of Class
 

Similar to Test-Construction-B (1).pptx

Writing good-multiple-choice-exams-fic-120116
Writing good-multiple-choice-exams-fic-120116Writing good-multiple-choice-exams-fic-120116
Writing good-multiple-choice-exams-fic-120116Abdelghani Qouhafa
 
Factors in constructing evaluative instruments
Factors in constructing evaluative instrumentsFactors in constructing evaluative instruments
Factors in constructing evaluative instrumentsCatherine Matias
 
Examination reform policy
Examination reform policy Examination reform policy
Examination reform policy Dr. Vishal Jain
 
Performance-Based Assessment (Assessment of Learning 2, Chapter 2))
Performance-Based Assessment (Assessment of Learning 2, Chapter 2))Performance-Based Assessment (Assessment of Learning 2, Chapter 2))
Performance-Based Assessment (Assessment of Learning 2, Chapter 2))paj261997
 
Developing Assessment Instruments
Developing Assessment InstrumentsDeveloping Assessment Instruments
Developing Assessment InstrumentsAngel Jones
 
Apt 501 chapter_7
Apt 501 chapter_7Apt 501 chapter_7
Apt 501 chapter_7cdjhaigler
 
Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7cdjhaigler
 
Developing Assessment Instrument
Developing Assessment InstrumentDeveloping Assessment Instrument
Developing Assessment Instrumentcdjhaigler
 
Principles of Teaching for LET Reciew
Principles of Teaching for LET ReciewPrinciples of Teaching for LET Reciew
Principles of Teaching for LET ReciewKate Cast-Vallar
 
Evaluation: Determining the Effect of the Intervention
Evaluation: Determining the Effect of the Intervention Evaluation: Determining the Effect of the Intervention
Evaluation: Determining the Effect of the Intervention Ijaz Ahmad
 
Developing assessment instruments
Developing assessment instrumentsDeveloping assessment instruments
Developing assessment instrumentsJCrawford62
 
Lesson3_Performance Assessment_March5 2024.pdf
Lesson3_Performance Assessment_March5 2024.pdfLesson3_Performance Assessment_March5 2024.pdf
Lesson3_Performance Assessment_March5 2024.pdfPrincessAngelMagbanu
 
Bloom’s taxonomy
Bloom’s taxonomyBloom’s taxonomy
Bloom’s taxonomyAtul Thakur
 
Outcomnes-based Education
Outcomnes-based EducationOutcomnes-based Education
Outcomnes-based EducationCarlo Magno
 
Assessment-of-learning
 Assessment-of-learning Assessment-of-learning
Assessment-of-learningaqosiAnn
 

Similar to Test-Construction-B (1).pptx (20)

Writing good-multiple-choice-exams-fic-120116
Writing good-multiple-choice-exams-fic-120116Writing good-multiple-choice-exams-fic-120116
Writing good-multiple-choice-exams-fic-120116
 
Factors in constructing evaluative instruments
Factors in constructing evaluative instrumentsFactors in constructing evaluative instruments
Factors in constructing evaluative instruments
 
Examination reform policy
Examination reform policy Examination reform policy
Examination reform policy
 
Performance-Based Assessment (Assessment of Learning 2, Chapter 2))
Performance-Based Assessment (Assessment of Learning 2, Chapter 2))Performance-Based Assessment (Assessment of Learning 2, Chapter 2))
Performance-Based Assessment (Assessment of Learning 2, Chapter 2))
 
Developing Assessment Instruments
Developing Assessment InstrumentsDeveloping Assessment Instruments
Developing Assessment Instruments
 
Presentation2
Presentation2Presentation2
Presentation2
 
Apt 501 chapter_7
Apt 501 chapter_7Apt 501 chapter_7
Apt 501 chapter_7
 
Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7Developing Assessment Instruments Chapter 7
Developing Assessment Instruments Chapter 7
 
Developing Assessment Instrument
Developing Assessment InstrumentDeveloping Assessment Instrument
Developing Assessment Instrument
 
Principles of Teaching for LET Reciew
Principles of Teaching for LET ReciewPrinciples of Teaching for LET Reciew
Principles of Teaching for LET Reciew
 
Chapter 11
Chapter 11Chapter 11
Chapter 11
 
Lesson 1 bb.docx
Lesson 1 bb.docxLesson 1 bb.docx
Lesson 1 bb.docx
 
Evaluation: Determining the Effect of the Intervention
Evaluation: Determining the Effect of the Intervention Evaluation: Determining the Effect of the Intervention
Evaluation: Determining the Effect of the Intervention
 
Developing assessment instruments
Developing assessment instrumentsDeveloping assessment instruments
Developing assessment instruments
 
Lesson3_Performance Assessment_March5 2024.pdf
Lesson3_Performance Assessment_March5 2024.pdfLesson3_Performance Assessment_March5 2024.pdf
Lesson3_Performance Assessment_March5 2024.pdf
 
Bloom’s taxonomy
Bloom’s taxonomyBloom’s taxonomy
Bloom’s taxonomy
 
Evaluating the curriculum
Evaluating the curriculumEvaluating the curriculum
Evaluating the curriculum
 
An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...
An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...
An Inventory of Conventional and Technology-Enabled Direct and Indirect Asses...
 
Outcomnes-based Education
Outcomnes-based EducationOutcomnes-based Education
Outcomnes-based Education
 
Assessment-of-learning
 Assessment-of-learning Assessment-of-learning
Assessment-of-learning
 

Recently uploaded

Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxDr.Ibrahim Hassaan
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayMakMakNepo
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationAadityaSharma884161
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 

Recently uploaded (20)

Gas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptxGas measurement O2,Co2,& ph) 04/2024.pptx
Gas measurement O2,Co2,& ph) 04/2024.pptx
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
Rapple "Scholarly Communications and the Sustainable Development Goals"
Rapple "Scholarly Communications and the Sustainable Development Goals"Rapple "Scholarly Communications and the Sustainable Development Goals"
Rapple "Scholarly Communications and the Sustainable Development Goals"
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Quarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up FridayQuarter 4 Peace-education.pptx Catch Up Friday
Quarter 4 Peace-education.pptx Catch Up Friday
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint Presentation
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 

Test-Construction-B (1).pptx

  • 1. Test Construction How to Write Effective Test Questions In-Service Training for Teachers April 13-14, 2020 SDO Lucena
  • 2. 1.Discuss the steps in developing table of specification 2.Construct a table of specification Objectives:
  • 4. Activity: Arrange the following steps in preparing the table of specification used by the test constructor. Make a two- way chart of a table of specification Make an outline of the subject matter to be covered in the test Construct the test items Select the learning outcomes to be measured Decide on the number of items per subtopic
  • 5. Philippine Professional Standards for Teachers (PPST) Domain 5: Assessment and Reporting Domain 5 relates to processes associated with a variety of assessment tools and strategies used by teachers in monitoring, evaluating, documenting and reporting learners’ needs, progress and achievement. This Domain concerns the use of assessment data in a variety of ways to inform and enhance the teaching and learning process and programs. It concerns teachers providing learners with the necessary feedback about learning outcomes. This feedback informs the reporting cycle and enables teachers to select, organize and use sound assessment processes.
  • 6. Domain 5, Assessment and Reporting, is composed of five strands: 1. Design, selection, organization and utilization of assessment strategies 2. Monitoring and evaluation of learner progress and achievement 3. Feedback to improve learning 4. Communication of learner needs, progress and achievement to key stakeholders 5. Use of assessment data to enhance teaching and learning practices and programs
  • 7. Domain 5, Assessment and Reporting, is composed of five strands: 1.Design, selection, organization and utilization of assessment strategies
  • 8. Table of Specification • A chart or table that details the content and level of cognitive assessed on a test as well as the types and emphases of test of items • Very important in addressing the validity and reliability of the test items • Provides the test constructor a way to ensure that the assessment is based on the intended learning outcomes • A way of ensuring that the number of questions on the test is adequate to ensure dependable results that are not likely caused by chance • A useful guide in constructing a test and in determining the type of test items that you need to construct
  • 9. DO 79, S. 2003 – ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS AMENDED BY DO 82, S. 2003 – AMENDMENT OF DEPED ORDER NO. 79 S. 2003 ON ASSESSMENT AND EVALUATION OF LEARNING AND REPORTING OF STUDENTS’ PROGRESS IN PUBLIC ELEMENTARY AND SECONDARY SCHOOLS 1. This Department, responding to the need for an assessment and evaluation system that truly reflects student performance, issues the following guidelines in the assessment and reporting of students’ progress: 1.1 Grades shall not be computed on the basis of any transmutation table that equates zero to a pre-selected base (such as 50 or 70) and adjusts other scores accordingly. 1.2 Grades shall be based on assessment that covers the range of learning competencies specified in the Philippine Elementary Learning Competencies (PELC) and Philippine Secondary Schools Learning Competencies (PSSLC). The test shall be designed as follows: - 60% easy items focused on basic content and skills expected of a student in each grade or year level; -30% medium-level items focused on higher level skills; and -10% difficult items focused on desirable content or skills that aim to distinguish the fast learners.
  • 10. DO 33, s 2004 - Implementing Guidelines on the Performance-Based Grading System for SY 2004-2005 3. In assessing learning outcomes, the construction of the test design should consist of 60% basic items, 30% more advanced items and 10% items for distinguishing honor students. Questions in each category should have different weights. Test and non-test items should cover only materials actually taken up in class. Factual information (easy) – 60% Moderately difficult (average) – 30% Higher order thinking skills (difficult) – 10%
  • 12. BLOOM’S REVISED TAXONOMY (Anderson and Krathwolh) 1956 2001
  • 13. Bloom’s Taxonomy in 1956 Anderson/Krathwolh’s Revision in 2001 1. Knowledge: Remembering or retrieving previously learned material. Examples of verbs that relate to this function are: identify, relate. List. Define, recall, memorize, repeat, record, name, recognize, acquire 1. Remembering: Objectives written on the remembering level – retrieving, recalling, or recognizing knowledge from memory. Remembering is when memory is used to produce definitions, facts, or lists; to recite or retrieve material. Sample verbs: state, tell, underline, locate, match, state, spell, fill in the blank, identify, relate, list, define, recall, memorize, repeat, record, name, recognize, acquire
  • 14. Bloom’s Taxonomy in 1956 Anderson/Krathwolh’s Revision in 2001 2. Comprehension: The ability to grasp or construct meaning from material. Examples of verbs: restate, locate, report, recognize, explain, express, identify, discuss, describe, review, infer, conclude, illustrate, interpret, draw, represent, differentiate 2. Understanding: Constructing meaning from different types of functions be they written or graphic message activities like interpreting, exemplifying, classifying, summarizing, inferring, comparing and explaining. Sample verbs: restate, locate, report, recognize, explain, express, identify, discuss, describe, review, infer, conclude, illustrate, interpret, draw, represent, differentiate
  • 15. Bloom’s Taxonomy in 1956 Anderson/Krathwolh’s Revision in 2001 3. Application: The ability to use learned material, or to implement material in new and concrete situations. Examples of verbs: apply, relate, develop, translate, use, operate, organize, employ, restructure, interpret, demonstrate, illustrate, practice, calculate, show, exhibit, dramatize 3. Applying: Carrying out or using a procedure through executing, or implementing. Applying relates and refers to situations where learned material is used through products like models, presentations, interviews or simulations. Sample verbs: apply, relate, develop, translate, use, operate, organize, employ, restructure, interpret, demonstrate, illustrate, practice, calculate, show, exhibit, dramatize
  • 16. Bloom’s Taxonomy in 1956 Anderson/Krathwolh’s Revision in 2001 4. Analysis: The ability to break down or distinguish the parts of the material into their components so that their organizational structure may be better understood. Examples of verbs: analyze, compare, probe, inquire, examine, contrast, categorize, differentiate, investigate, detect, survey, classify, deduce, experiment, scrutinize, discover, inspect, dissect, discriminate, separate 4. Analyzing: Breaking material or concepts into parts, determining how the parts relate or interrelate to one another or to an overall structure or purpose. Mental actions include in this function are differentiating, organizing and attributing, as well as being able to distinguish between the components or parts. When one is analyzing, he/she can illustrate this mental function by creating spreadsheets, surveys, charts, or diagrams, or graphic representations.
  • 17. Bloom’s Taxonomy in 1956 Anderson/Krathwolh’s Revision in 2001 5. Synthesis: The ability to put parts together to form a coherent or unique new whole. Examples of verbs: produce, design, assemble, create, prepare, predict, modify, plan, invent, formulate, collect, set up, generalize, document, combine, propose, develop, arrange, construct, organize, originate, derive, write 5. Evaluating: Making judgments based on criteria and standards through checking and critiquing. Critiques, recommendations, and reports are some of the products that can be created to demonstrate the processes of evaluation. Sample verbs: appraise, choose, compare, conclude, decide, defend, evaluate, give your opinion, judge, justify, prioritize, rank, rate, select, support, value
  • 18. Bloom’s Taxonomy in 1956 Anderson/Krathwolh’s Revision in 2001 6. Evaluation: The ability to judge, check, and even critique the value of material for a given purpose. Examples of verbs: judge, assess, compare, evaluate, conclude, measure, deduce, argue, decide, choose, rate, select, estimate, validate, consider, appraise, value, criticize, infer 6. Creating: Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning, or producing. Creating requires users to put parts together in a new way or synthesize parts into something new and different form or product. This process is the most difficult mental function in the new taxonomy. Sample verbs – change, combine, compose, construct, create, invent, design, formulate, generate, produce, revise, reconstruct, rearrange, visualize, write, plan
  • 19.
  • 20. Table of Specification Learning Competency Number of Days Number of Items Item Placement Cognitive Level Remembering Understanding (60%) Easy Applying (30%) Average Analyzing Evaluating Creating (10%) Difficult Basic Concepts of Fractions 1 5 1-5 Addition of Fractions 1 5 6-10 Subtraction of Fractions 1 5 11-15 Multiplication and Division of Fractions 3 15 16-30 31-40 Application/ Problem Solving 4 20 41-45 46-50 Total 10 50 30 15 5
  • 21. How to Determine the No. of Items? Formula: No. of items = 𝑛𝑢𝑚𝑏𝑒𝑟 𝑜𝑓 𝑑𝑎𝑦𝑠 𝑥 𝑑𝑒𝑠𝑖𝑟𝑒𝑑 𝑡𝑜𝑡𝑎𝑙 𝑛𝑢𝑚𝑏𝑒𝑟 𝑜𝑓 𝑖𝑡𝑒𝑚𝑠 𝑡𝑜𝑡𝑎𝑙 𝑛𝑢𝑚𝑏𝑒𝑟 𝑜𝑓 𝑑𝑎𝑦𝑠 Example: Learning Competency: Multiplication and Division of Fractions Number of days: 3 Desired no. of items: 50 Total no. of class sessions: 10 No. of items = 3 𝑥 50 10 = 15
  • 22. Check your understanding: Directions: Complete the table by supplying the no. of items for each learning competency. Learning Competency No. of Class Sessions No. of Items Musculo-Skeletal System 2 Integumentary System 2 Digestive System 3 Respiratory System 3 Circulatory System 4 Total 14
  • 23. Workshop – Making Table of Specification Using your Curriculum Guide, make a table of specification for Periodic Test per subject area in each quarter.
  • 24.
  • 26. 1. Identify the different rules in constructing multiple choice test 2. Construct multiple choice test Objectives:
  • 27.
  • 28. Research indicates . . . • Teachers tend to use tests that they have prepared themselves much more often than any other type of test. (How Teaching Matters, National Council for Accreditation of Teacher Education, Oct. 2000) • While assessment options are diverse, most classroom educators rely on text and curriculum-embedded questions, and tests that are overwhelmingly classified as paper-and-pencil (National Commission on Teaching and America’s Future, 1996)
  • 29. Research indicates . . . • Formal training in paper-and-pencil test construction may occur at the preservice level (52% of the time) or as in-service preparation (21%). A significant number of professional educators (48%) report no formal training in developing, administering, scoring, and interpreting tests (Education Week, “National Survey of Public School Teachers, 2000). • Students report a higher level of test anxiety over teacher-made tests (64%) than over standardized tests (30%). The top three reasons why: poor test construction, irrelevant or obscure material coverage, and unclear directions. (NCATE, “Summary Data on Teacher Effectiveness, Teacher Quality, and Teacher Qualifications”, 2001)
  • 30. Two General Categories of Test Items 1. Objective items which require students to select the correct response from several alternatives or to supply a word or short phrase to answer a question or complete a statement. Objective items include: multiple choice, true-false, matching, completion 2. Subjective or essay items which permit the student to organize and present an original answer. Subjective items include: short-answer essay, extended-response essay, problem solving, performance test items
  • 31. Creating a test is one of the most challenging tasks confronting a teacher. Unfortunately, many of us have had little, if any, preparation in writing tests.
  • 32. What makes a test good or bad? The most basic and obvious answer to that question is that good tests measure what you want to measure, and bad tests do not.
  • 33. When to use objective tests? Objective tests are appropriate when: The group to be tested is large and the test may be reused. Highly reliable scores must be obtained as efficiently as possible. Impartiality of evaluation, fairness, and freedom from possible test scoring influences are essential.
  • 34. When to use objective tests? Objective tests can be used to: Measure almost any important educational achievement a written test can measure. Test understanding and ability to apply principles. Test ability to think critically. Test ability to solve problems
  • 35. The matching of learning objective expectations with certain item types provides a high degree of test validity: testing what is supposed to be tested.
  • 36. Matching Learning Objectives with Test Items Directions: Below are four test item categories labeled A, B, C, and D. Match the learning objectives with the most appropriate test item category. A – Objective Test Item (MC, true-false, matching) B – Performance Test Item C – Essay Test Item (extended response) D – Essay Test Item (short answer) 1. Name the parts of the human skeleton. Answer: A 2. Appraise a composition on the basis of its organization. Answer: C 3. Demonstrate safe laboratory skills. Answer: B
  • 37. Matching Learning Objectives with Test Items Directions: Below are four test item categories labeled A, B, C, and D. Match the learning objectives with the most appropriate test item category. A – Objective Test Item (MC, true-false, matching) B – Performance Test Item C – Essay Test Item (extended response) D – Essay Test Item (short answer) 4. Cite four examples of satire that Twain uses in Huckleberry Finn. Answer: D 5. Design a logo for a web page. Answer: B 6. Describe the impact of a bull market. Answer: C 7. Diagnose a physical ailment. Answer: B
  • 38. Matching Learning Objectives with Test Items Directions: Below are four test item categories labeled A, B, C, and D. Match the learning objectives with the most appropriate test item category. A – Objective Test Item (MC, true-false, matching) B – Performance Test Item C – Essay Test Item (extended response) D – Essay Test Item (short answer) 8. List important mental attributes necessary for an athlete. Answer: D 9. Categorize great American fiction writers. Answer: A 10. Analyze the major causes of learning disabilities. Answer: C
  • 39. In general, test items should . . . •Assess achievement of instructional objectives •Measure important aspects of the subject (concepts and conceptual relations) •Accurately reflect the emphasis placed on important aspects of instruction •Measure an appropriate level of student knowledge •Vary in levels of difficulty
  • 40. Technical Quality of a Test 1. Cognitive Complexity The test questions will focus on appropriate intellectual activity ranging from simple recall of facts to problem solving, critical thinking, and reasoning. 2. Content Quality The test questions will permit students to demonstrate their knowledge of challenging and important subject matter. 3. Meaningfulness The test questions will be worth students’ time and students will recognize and understand their value.
  • 41. Technical Quality of a Test 4. Language Appropriateness The language demands will be clear and appropriate to the assessment tasks and to students. 5. Transfer and Generalizability Successful performance on the test will allow valid generalizations about achievement to be made.
  • 42. Technical Quality of a Test 6. Fairness Student performance will be measured in a way that does not give advantage to factors irrelevant to school learning; scoring schemes will be similarly equitable. 7. Reliability Answers to test questions will be consistently trusted to represent what students know.
  • 44. Activity: Piliin Mo Ako! (Choose Me!) Directions: Choose the letter of the best answer.
  • 45. Question 1: Multiple choice items provide highly reliable test scores because: A. They do not place a high degree of dependence on the student’s reading ability B. They place a high degree of dependence on a teacher’s writing ability C. They are a subjective measurement of student achievement D. They allow a wide sampling of content and a reduced guessing factor Answer: D
  • 46. Question 2: You should: A. Always decide on an answer before reading the alternatives B. Always review your marked exams C. Never change an answer D. Always do the multiple choice items on an exam first Answer: B
  • 47. Question 3: The multiple choice item on the right is structurally undesirable because: A. A direct question is more desirable than an incomplete statement B. There is no explicit problem or information in the stem C. The alternatives are not all plausible D. All of the above E. A & B only F. B & C only G. A & C only H. None of the above Answer: D You should: A. Always decide on an answer before reading the alternatives B. Always review your marked exams C. Never change an answer D. Always do the multiple choice items on an exam first
  • 48. Question 4: The Question 3 multiple choice item on the right is undesirable because: A. It relies on an answer required in a previous item B. The stem does not supply enough information C. Eight alternatives are too many and too confusing to the students D. More alternatives just encourage guessing Answer: C Question 3: The multiple choice item on the right is structurally undesirable because: A. A direct question is more desirable than an incomplete statement B. There is no explicit problem or information in the stem C. The alternatives are not all plausible D. All of the above E. A & B only F. B & C only G. A & C only H. None of the above
  • 49. Question 5 The right answers in multiple choice questions tend to be: A. Longer and more descriptive B. The same length as the wrong answer C. At least a paragraph long D. Short Answer: A
  • 50. Question 6 When guessing on a multiple choice question with numbers in the answer A. Always pick the most extreme B. Pick the lowest range C. Pick answers in the middle range D. Always pick C Answer: C
  • 51. Question 7: What is the process of elimination in a multiple choice question? A. Skipping the entire question B. Eliminating all answers with extreme modifiers C. Just guessing D. Eliminating the wrong answers Answer: D
  • 52. Question 8 It is unlikely that a student who is unskilled in untangling negative statements will: A. Quickly understand multiple choice items not written in this way B. Not quickly understand multiple choice items not written in this way C. Quickly understand multiple choice items written in this way D. Not quickly understand multiple choice items written in this way Answer: C
  • 53. Multiple Choice Test Items An MC item consists of a stem, which identifies the question or problem, and the response alternatives or choices. Usually, students are asked to select the one alternative that best completes a statement or answers a question. Item Stem: Which of the following is a chemical change? Response Alternatives: A. Evaporation of alcohol B. Freezing of water C. Burning of oil D. Melting of wax
  • 54. General Guidelines in Constructing MC Test 1. Make test items practical, with real-world applications for the students. 2. Use diagrams or drawings when asking questions about application, analysis or evaluation. 3. When asking students to interpret or evaluate quotations, present actual quotations from secondary sources such as published books or newspapers. 4. Use tables, figures, or charts when asking questions that require interpretation. 5. Use pictures if possible when students are required to apply concepts and principles.
  • 55. General Guidelines in Constructing MC Test 6. List the choices/options vertically not horizontally. 7. Avoid trivial questions. 8. Use only one correct answer or best answer format. 9. Use three to five options to discourage guessing. 10. Be sure that distracters are plausible and effective. 11. Increase the similarity of the options to increase the difficulty of the item. 12. Do not use “none of the above” options when asking for a best answer. 13. Avoid using “all of the above” options. It is usually the correct answer and makes the item too easy for the examinee with partial knowledge.
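Several of the general guidelines above are mechanical enough to check automatically. The sketch below is only an illustration of that idea: it encodes the three-to-five-options rule and the advice against "all of the above" / "none of the above", while judgment calls such as plausibility of distracters are left to the item writer. All names are made up, and the sample item adapts the chemical-change item shown earlier with an "All of the above" option swapped in so the check has something to flag.

```python
# Illustrative checker for a few quantifiable guidelines from the list above.
# It does not replace human review of content, plausibility, or clarity.

def check_mc_item(options, key):
    """Return warnings for one MC item given its options (letter -> text) and keyed letter."""
    warnings = []
    if not 3 <= len(options) <= 5:
        warnings.append("Use three to five options to discourage guessing.")
    if key not in options:
        warnings.append("The keyed answer must be one of the listed options.")
    banned = {"all of the above", "none of the above", "i don't know"}
    for letter, text in options.items():
        if text.strip().lower() in banned:
            warnings.append(f"Option {letter}: avoid '{text}'.")
    return warnings

item = {
    "stem": "Which of the following is a chemical change?",
    "options": {"A": "Evaporation of alcohol", "B": "Freezing of water",
                "C": "Burning of oil", "D": "All of the above"},
    "key": "C",
}
print(check_mc_item(item["options"], item["key"]))
# -> ["Option D: avoid 'All of the above'."]
```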
  • 56. Guidelines in Constructing the Stem 1. The stem should be written in question form or completion form. Research shows that the question form is more advisable. 2. Do not leave the blank at the beginning or in the middle of the stem when using the completion form of a multiple-choice item. 3. The stem should pose the problem completely. 4. The stem should be clear and concise. 5. Avoid excessive and meaningless words in the stem. 6. State the stem in positive form. Avoid negative words like “not” or “except”; if they cannot be avoided, underline or capitalize them. Example: Which of the following does not belong to the group? or Which of the following does NOT belong to the group? 7. Avoid grammatical clues to the correct answer.
  • 57. Guidelines in Constructing Options 1. There should be one correct or best answer in each item. 2. List options vertically, not horizontally, beneath the stem. 3. Arrange the options in logical order and use capital letters (A, B, C, D, E) to indicate each option. 4. Options should not overlap; keep them independent of one another. 5. All options must be homogeneous in content to increase the difficulty of an item. 6. As much as possible, the options should be equal in length. 7. Avoid using the phrase “all of the above.” 8. Avoid using the phrase “none of the above” or “I don’t know.”
  • 58. Guidelines in Constructing the Distracters 1. The distracters should be plausible. 2. The distracters should be equally popular with all examinees. 3. Avoid using ineffective distracters. Replace distracters that do not work on the examinees. 4. Each distracter should be chosen by at least 5% of the examinees, but by fewer examinees than the keyed answer. 5. Revise distracters that are overly attractive to the test takers; they might be ambiguous to the examinees.
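Guideline 4 can be checked directly from examinee responses once the test has been administered. The following is a rough sketch only, assuming you have recorded each examinee's chosen letter for the item; the function name and the sample response counts are hypothetical.

```python
from collections import Counter

def distracter_report(responses, key, threshold=0.05):
    """responses: chosen letters for one item; key: the keyed (correct) letter."""
    counts = Counter(responses)
    total = len(responses)
    report = {}
    for option, count in sorted(counts.items()):
        share = count / total
        flags = []
        if option != key:
            if share < threshold:
                flags.append("chosen by fewer than 5% of examinees - consider revising")
            if count > counts[key]:
                flags.append("chosen more often than the key - possibly ambiguous")
        report[option] = (round(share * 100, 1), flags)
    return report

# Hypothetical responses of 40 examinees to one item keyed 'C'
responses = ["C"] * 22 + ["A"] * 9 + ["B"] * 8 + ["D"] * 1
print(distracter_report(responses, key="C"))
# Option D is flagged: only 2.5% of examinees chose it.
```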
  • 59. Advantages of MC Test 1. Measures learning outcomes from the knowledge level to the evaluation level. 2. Scoring is highly objective, easy, and reliable. 3. Scores are more reliable than those from subjective types of tests. 4. Measures broad samples of content within a short span of time. 5. Distracters can provide diagnostic information. 6. Item analysis can reveal the difficulty of an item and how well it discriminates between good and poor performing students.
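Point 6 above mentions item difficulty and discrimination. A common way to estimate them (a sketch under the usual upper-group/lower-group convention, not a procedure prescribed in these slides) is to take the difficulty index as the proportion of examinees answering the item correctly and the discrimination index as the difference in that proportion between the top and bottom 27% of examinees ranked by total score. All names and data below are hypothetical.

```python
def difficulty_index(item_scores):
    """Proportion of examinees who got the item right (1 = correct, 0 = wrong)."""
    return sum(item_scores) / len(item_scores)

def discrimination_index(item_scores, total_scores, group_fraction=0.27):
    """Proportion correct in the upper group minus the lower group,
    with groups formed from total test scores (27% is a common convention)."""
    ranked = [item for _, item in sorted(zip(total_scores, item_scores), reverse=True)]
    k = max(1, int(len(ranked) * group_fraction))
    upper, lower = ranked[:k], ranked[-k:]
    return sum(upper) / k - sum(lower) / k

# Hypothetical data: 10 examinees' scores on one item and on the whole test
item = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
totals = [48, 45, 44, 40, 38, 30, 25, 22, 20, 15]
print(difficulty_index(item))              # 0.5 -> moderately difficult
print(discrimination_index(item, totals))  # 1.0 -> upper group did much better
```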
  • 60. Disadvantages of MC Test 1. Time consuming to construct good items. 2. Difficult to find effective and plausible distracters. 3. Scores can be influenced by the reading ability of the examinees. 4. In some cases, there is more than one justifiable correct answer. 5. Ineffective in assessing the problem-solving skills of the students. 6. Not applicable when assessing the students’ ability to organize and express ideas.
  • 61. Activity: Improve Mo Ako! (Improve Me!) Directions: The following multiple choice questions are poorly constructed. Write a better version of each question.
  • 68. Item 7 Poor Item Better Item
  • 69. Item 8 Poor Item Better Item
  • 70. Item 9 Poor Item Better Item
  • 71. Item 10 Poor Item Better Item
  • 72. “Understand that there is always one clearly best answer. Your goal is not to trick students or require them to make difficult judgments about two options that are nearly equally correct. Your goal is to design questions that students who understand will answer correctly and students who do not understand will answer incorrectly.” John A. Johnson Dept. of Psychology Penn State University
  • 73. POINTS TO PONDER. . . A good lesson makes a good question A good question makes a good content A good content makes a good test A good test makes a good grade A good grade makes a good student A good student makes a good COMMUNITY Jesus Ochave Ph.D. VP Research Planning and Development Philippine Normal University