Assessing Higher Order Thinking
Skills through Multiple Choice Tests
• The following are suggestions for using multiple-choice tests to assess HOTS.
1. Create questions that require:
• Synthesis of a larger amount of material.
• Analysis of a situation or problem.
• Application of understanding of the content.
• A longer amount of time to answer (3–5 minutes each).
2. Design questions that require the students to apply knowledge, principles, or laws, or to calculate results before selecting answers.
3. Design questions that require students to synthesize information from more than one unit or aspect of the topic in order to answer the question.
4. Present a case situation
that supplies all the “clues”
needed to analyze the
situation.
Example:
• The criterion of success in Teacher Apol’s objective is that “the student must be able to solve 75% of the worded problems correctly.” Ritz and 29 others in the class answered only 18 out of 25 items correctly. This means that Teacher Apol ________.
a. attained the lesson objective because of his effective problem-solving drills
b. did not attain his lesson objective because of his students’ lack of attention
c. attained his lesson objective
d. did not attain the lesson objective as far as the 30 students are concerned
• The answer to the given example is D, because only 72% of the given items were answered correctly by the students.
• A question like this, which requires more than memorization to answer correctly, is what we refer to as a HOTS question.
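The arithmetic behind the answer can be verified with a short script; this is a minimal sketch using only the figures given in the example (criterion of 75%, 18 of 25 items correct):

```python
# Check whether Teacher Apol's criterion of success was met.
criterion = 0.75       # "solve 75% of the worded problems correctly"
items_total = 25       # number of items in the test
items_correct = 18     # items answered correctly by Ritz and 29 classmates

score = items_correct / items_total
print(f"Class score: {score:.0%}")  # 72%
if score >= criterion:
    print("Objective attained")
else:
    print("Objective not attained")
```

Since 18/25 = 72% falls short of the 75% criterion, the script confirms why option D is the credited answer for all 30 students.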
• In writing multiple-choice items that demand higher order thinking skills, four patterns can be considered. These are the following:
1. Premise – Consequence
• Students must identify the correct outcome of a given circumstance.
2. Analogy
•Students must map the
relationship between two items
in a different context.
Example: USA:Democratic;
China:________
a. Aristocracy c. Communist
b. Autocracy d. Monarchy
3. Case Study
•A single, well-written
paragraph can provide
material for several follow-up
questions.
• Example: Angelica, Kyla, and Danica own a small
business: the T-shirt Company, printing personalized
designs. Because Angelica has many outside
commitments and Kyla also has a few, Danica tends
to be most in touch with the daily operations of T-
shirt Company. As a result, when financial decisions
come down to a vote at their monthly meeting, they
have decided that Danica gets 8 votes, Kyla gets 7,
and Angelica gets 2 with 9 being required to make
decisions. According to minimum-resource coalition
theory, who is most likely to be courted for their vote?
a. Kyla
b. Angelica
c. Danica
d. No trend toward any specific person
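The case above can be worked through by enumerating the winning coalitions. The sketch below assumes the common reading of minimum-resource coalition theory, under which the winning coalition with the smallest combined resources is the one predicted to form:

```python
from itertools import combinations

# Vote counts from the case study; 9 votes are required to pass a decision.
votes = {"Danica": 8, "Kyla": 7, "Angelica": 2}
quota = 9

# Every coalition that reaches the quota, with its combined vote total.
winning = [
    (group, sum(votes[name] for name in group))
    for size in (2, 3)
    for group in combinations(votes, size)
    if sum(votes[name] for name in group) >= quota
]

# Minimum-resource theory favors the cheapest winning coalition.
cheapest = min(winning, key=lambda pair: pair[1])
print(cheapest)  # (('Kyla', 'Angelica'), 9)
```

Both larger partners can form a cheaper winning coalition by pairing with Angelica (Kyla + Angelica = 9; Danica + Angelica = 10, versus Danica + Kyla = 15), which is why, under this reading of the theory, the partner with the fewest votes is the one courted.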
4. Incomplete Scenario
• Students must respond to what is missing or
needs to be changed within a provided
scenario.
• Note: when using a graph or image, try to lay
it out differently than how the students have
seen it. This is equivalent to using new
language to present a familiar concept and
prevent students from using rote
memorization to answer the question. For
example, the diagram below may originally
have been laid out as a series and may not
be as detailed as the diagram they saw in
the book.
Matching Type Test Items
•This type of test requires
students or the examinees to
match each word, number, or
symbol in one column to a word,
sentence or phrase in the other
column. The following are
suggested guidelines in writing
matching type of test items:
1. There should be homogeneity in the materials used in a single matching exercise. For example, if the test is about the tools used in consumer electronics troubleshooting, the responses should all be tools and the premises should all pertain to their descriptions.
2. There should be a short blank before each premise for the symbol (letter, number, etc.) of the response to which the premise refers.
3. There should be clear and
concise directions stating the
basis for the association.
4. The responses should be
listed in logical order.
5. Items to be matched should be brief, and the shorter responses should be in the right column. Approximately four to seven items in each column seem best, while others suggest approximately ten items. In choosing the number of items, note that as the number of items decreases, the difficulty might suffer because pairing would be relatively easy. On the other hand, tests with more than ten items would result in laborious pairing for the examinees.
6. Always include more responses than premises or questions.
Content Validation
Content validation for teacher-made tests
is done to check for accumulated unwanted
errors that might distort the item's
function, adversely affecting the
attainment of the objectives. These
'loopholes' can be identified by:
•Reviewing the test items yourself after a
few days of setting it aside
• Asking a group of fellow teachers to
review the test
• The following are some of the aspects
to focus on when reviewing the test
items:
1. Appropriateness of the test format for
the learning outcome being measured
•Review the behavioral verbs in the
objectives and see if the test format is
really suited for the learning outcome
being measured. If not, revise the
corresponding test items to suit the
original purpose.
2. Clarity of the test item
Review for possible unclear,
lengthy, awkward, and
inappropriate statements or
words that might have been
overlooked. Reviewing the test
items after setting them aside for
a few days facilitates a fresh
point of view which helps in
noticing such defects.
3. Technical and irrelevant clues
Review for grammatical inconsistencies, verbal associations, and specific determiners (e.g., always, never).
4. Racial, ethnic, and sexual bias
•Review each item's sensitivity to
members of all groups. Make sure that
the vocabulary and situations used are
acceptable and would have the same
meaning across the members of the
class or the individuals concerned.
5. Make sure that the test items are properly arranged
The general guideline in arranging items in the test is as follows:
• Items should be arranged in sections by item type (e.g., multiple-choice items should be grouped together, and likewise the other types of test items).
•When two or more test item types are
included, it is suggested to follow the
sequence below: