Fire and Emergency Services
Instructor
9th Edition
Chapter 13 — Test Item Construction
‣ Describe common considerations for
test instruments.
Learning Objective 1
Common Considerations for All Tests
Test items must always be based on specific learning
objectives; Level II Instructors must consider
‣ Test formatting and item arrangement
‣ Test item level of cognition and difficulty
‣ Test instructions and time requirements
‣ Testing bias
Kirkpatrick’s Four Levels of Evaluation
Test Formatting and Item Arrangement
‣ Provide space for students to write their name and the
date on either the test sheet or a separate answer
sheet
‣ Provide a title or label at the top of the first page
‣ Number all tests and label different versions of the test
— This will help with score reporting and test security
‣ Number all pages of the test — Students will be able to
budget their time more wisely if they can see the
length of the test
Test Formatting and Item Arrangement
‣ Provide clear instructions at the beginning of the test
and at the beginning of each section that uses a
different type of test item (such as multiple-choice,
matching, true-false, or fill-in-the-blank)
‣ Provide a sample test item, along with a sample
answer, to show students how to respond to each
item
‣ Number all items consecutively
‣ Single-space each test item, but double-space
between items
Test Formatting and Item Arrangement
‣ State the point value of each test item, for example
‣ Multiple-choice: 1 point each
‣ Short-answer: 2 points each
‣ Use commonly understood terms
‣ For example, do not use abbreviations unless they
are placed in parentheses following the common
term
Test Formatting and Item Arrangement
‣ Test items must be arranged in a logical sequence, and
can be grouped into two categories
‣ Learning domain outcome
‣ Knowledge
‣ Comprehension
‣ Application
‣ Type of test item
‣ Multiple-choice
‣ Matching
‣ Short-answer
Test Formatting and Item Arrangement
‣ Instructors must make sure that the wording of one
test item does not reveal the answer to another test
item
‣ On computer-adaptive tests, such as those used in
EMS, sequencing may be organized so that students
can only progress to a more difficult question after
correctly answering a simpler one
Test Item Level of Cognition and Difficulty
‣ Test items should evaluate the student’s ability at the
level within the taxonomy that corresponds to the
learning objective being evaluated; a variety of levels
can exist within a course
‣ The actual determination of test difficulty is the
responsibility of a Level III Instructor
‣ The instructor compiles scored tests, and evaluates
students’ performance on individual questions
‣ The difficulty level of each question is then saved
for future testing use and evaluation
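The compilation step above can be sketched in code. This is a minimal illustration assuming the classical item-difficulty index (the proportion of students who answered an item correctly); the question labels and response data are invented for the example:

```python
# Hypothetical sketch: computing an item-difficulty index
# (fraction of students answering correctly) from scored tests.
# Question labels and response data are illustrative only.

def item_difficulty(responses):
    """Return the fraction of correct responses (0.0-1.0) for one item."""
    if not responses:
        raise ValueError("no responses recorded")
    return sum(responses) / len(responses)

# Each list holds 1 (correct) or 0 (incorrect) per student.
scored_items = {
    "Q1": [1, 1, 1, 0, 1],   # easier item: most students correct
    "Q2": [1, 0, 0, 0, 1],   # harder item
}

# Difficulty level per question, saved for future testing use.
difficulty = {q: item_difficulty(r) for q, r in scored_items.items()}
```

A higher value means an easier item; the Level III Instructor could store these values alongside each question for later test assembly.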
Test Instructions
‣ Purpose of the test
‣ Method and means for recording answers
‣ Recommendation on whether to guess when undecided
on an answer (in some cases, incorrect answers are
penalized more heavily than unanswered questions)
‣ Amount of time available to complete the test
Time Requirements for Answering Certain Types of Questions
‣ Instructors can use these estimates to calculate how
much time students will need to complete the test
‣ May take the test, or ask another professional
member of the organization to take the test, to see
how much time is needed
‣ Tests should be an appropriate length to address
the learning objectives that the test is intended to
evaluate
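The calculation described above can be sketched as follows. The minutes-per-item figures here are assumptions chosen for illustration, not values from the manual:

```python
# Illustrative sketch: estimating total test time from per-item-type
# time estimates. The minutes-per-item figures are assumptions.

MINUTES_PER_ITEM = {
    "true-false": 0.5,
    "multiple-choice": 1.0,
    "short-answer": 2.0,
    "essay": 15.0,
}

def estimated_minutes(item_counts):
    """Sum the per-type time estimates for every item on the test."""
    return sum(MINUTES_PER_ITEM[item_type] * count
               for item_type, count in item_counts.items())

# A test with 20 multiple-choice, 10 true-false, and 5 short-answer items.
total = estimated_minutes({"multiple-choice": 20,
                           "true-false": 10,
                           "short-answer": 5})
```

Taking the test personally, or having another instructor take it, remains the most direct check on whatever estimate a calculation like this produces.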
Time Requirements for Answering
Certain Types of Questions
‣ When time is a restrictive factor, tests can emphasize
the most critical learning objectives and include a
sampling of less important objectives; this method of
test construction is called sampling
‣ The plan must be documented to prove that necessary
components were addressed
‣ The most critical objectives must be tested in each
version of the test while the less critical objectives
tested are included in a certain rotation
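One possible way to implement that rotation in code; the objective names and sample size are invented for the example:

```python
# Hypothetical sketch of the sampling approach described above: every
# test version covers all critical objectives, while less-critical
# objectives rotate in on a fixed, documentable cycle.

CRITICAL = ["don SCBA", "hose deployment", "ladder raise"]
LESS_CRITICAL = ["knot tying", "radio procedure",
                 "salvage cover", "tool maintenance"]

def objectives_for_version(version, sample_size=2):
    """Return the objectives tested on a given test version number."""
    # Rotate the less-critical list so each version samples a new slice.
    start = (version * sample_size) % len(LESS_CRITICAL)
    rotated = LESS_CRITICAL[start:] + LESS_CRITICAL[:start]
    return CRITICAL + rotated[:sample_size]
```

Because the rotation is deterministic, the version number alone documents which less-critical objectives each test covered.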
Testing Bias
‣ Test items and testing instruments should not favor or
penalize any particular group of students
‣ Ensuring that test questions very closely reflect the
materials being tested is the best way to avoid bias in
testing materials
‣ When students recognize that test items closely
resemble the information that they have studied, they
are more likely to perform confidently on tests
regardless of their gender, cultural, ethnic, or regional
backgrounds
Testing Bias
‣ In the fire and emergency services, bias is generally
limited to use of regional jargon and differences in
terminology
‣ Example: local governments in the U.S. and
Canada may be referred to as
‣ Counties or parishes
‣ Jurisdictions in legal terms
Testing Bias
‣ Similarly, some fire apparatus may be referred to as
a tanker or a tender depending on geographical
region or the differences between departments
‣ The terminology on the test should reflect the
terminology of the students and the materials from
which they studied or received training
‣ Discuss various types of evaluation
instruments used in fire and emergency
service training.
Learning Objective 2
‣ Deciding how to evaluate students can be a confusing
and difficult topic for the Level II Instructor
‣ Instructor needs to establish the appropriate evaluation
instrument to meet the parameters of the curriculum
and course being presented
‣ In addition, once the instructor determines the type of
instrument to use, they must then design the
instrument to be fair and unbiased
‣ This process takes time, and new Level II Instructors
may often underestimate the time commitment
Student Evaluation Instruments
Written Tests
Written Tests: Objective
‣ An objective test item is a question for which there is
only one correct answer
‣ Judgment of the instructor or evaluator is not
relevant and has no effect on assessment
‣ Objective items measure cognitive learning, but
typically only at the lower levels of remembering and
understanding
Written Tests: Objective
‣ Properly constructed objective test items can also be
used to measure higher levels of cognitive learning
such as evaluation or creation
‣ Three main types of objective questions
‣ Multiple-choice
‣ True or false
‣ Matching
Written Tests: Subjective
‣ A subjective test item has no single correct answer;
the evaluator’s judgment may therefore affect
assessment
‣ Subjective items are an effective way of measuring
higher cognitive levels
‣ They allow students the freedom to organize,
analyze, revise, redesign, or evaluate a problem
Written Tests: Subjective
‣ The strength of a student’s response to these items
depends on a variety of factors, such as
‣ How well they communicate their ideas
‣ Personal opinions of the evaluator
Written Tests: Subjective
‣ There are three main types of subjective test items
‣ Short-answer or completion
‣ Essay
‣ Interpretive exercise
Written Tests: Multiple Choice
A multiple-choice test item consists of either a
question or an incomplete statement, called the stem,
plus a list of several possible responses, which are
referred to as choices or alternatives
Written Tests: Multiple Choice
‣ Students are tasked to read the stem and select the
correct response from the list of alternatives
‣ The correct choice is known as the answer and the
remaining choices are called distractors
‣ Distractors are used to discriminate between students
who understand the subject matter well and those
who are uncertain of the correct answer
‣ Distractors are not meant to trick, confuse, or mislead
students
‣ Write the stem in the form of a direct question or an
incomplete sentence that measures only one learning
objective
‣ Write a clear, brief stem that contains most of the
wording for the test item
‣ This helps to avoid placing repeated words in the
alternatives
‣ Write positive questions as much as possible; be
consistent in labeling negative words if and when
negative statements are used
Written Tests: Multiple Choice
Guidelines
‣ Provide at least three plausible, attractive distractors
‣ Phrase the choices so that they are parallel and
grammatically consistent with the stem
‣ Place correct answers in varied positions among the
A, B, C, and D choices
‣ Place each choice on a separate, indented line, and
in a single column
‣ Begin responses with capital letters when the stem is
a complete question
Written Tests: Multiple Choice
Guidelines
‣ Begin responses with lowercase letters when the
stem is an incomplete sentence
‣ Do not include choices that are obviously wrong or
intended to be humorous
‣ Make sure that stems and alternatives do not give
students grammatical clues as to the correct
response
‣ Make all alternatives close to the same length
Written Tests: Multiple Choice
Guidelines
‣ Avoid using the phrases
‣ All of the above
‣ None of the above
‣ Do not test trivial ideas or information
‣ Use correct grammar and punctuation
Written Tests: Multiple Choice
Disadvantages
‣ They are not well suited to measuring certain
cognitive skills, such as organizing and presenting
ideas; essay tests are more effective for this purpose
‣ Depending on the test writer’s skill, this type of test
may not include different difficulty-level test items
that measure a variety of cognitive learning levels
‣ Creating appropriate and plausible distractors for
each stem can require significant time and thought
‣ Students who do not know the material may still be
able to guess the correct answer
‣ The true-false test item is a single statement that the
student must determine to be either true or false
‣ It is difficult to construct a statement that is
completely true or completely false
Written Tests: True-False
‣ True statements should be based on facts
‣ False statements should be based on common
misconceptions of the facts
‣ In addition to the traditional true-false test items,
there are also modified true-false test items
‣ Modified true-false items ask the student to explain
why an item is false or to rewrite the item to make it
true
Written Tests: True-False
‣ One limitation of true-false questions
‣ Students tend to remember the false items on the
test as being true, known as the negative
suggestion effect
‣ Instructors should review the correct answers to
true-false questions with students after scoring the
test to help combat this effect
Written Tests: True-False Guidelines
‣ Write the words True and False at the left margin if
students must mark their answers on the test paper
‣ On computer-scored answer sheets
‣ True is assigned to A
‣ False is assigned to B
‣ Provide clear instructions so that students know how
to respond to each statement
Written Tests: True-False Guidelines
‣ Create enough test items to provide reliable results
‣ For reliability, more true-false items are needed
than would be used for multiple-choice items
‣ A large number of test items minimizes the
possibility of guessing the correct answers
‣ Distribute true and false items randomly
Written Tests: True-False Guidelines
‣ Avoid determiners (words that indicate a specific
answer) that provide unwarranted clues
‣ Words such as usually, generally, often, or
sometimes are most likely to appear in true
statements
‣ The words never, all, always, or none are more
likely to be found in false statements
‣ Avoid creating items that could trick or mislead
students into making a mistake
Written Tests: True-False Guidelines
‣ Ensure only one correct answer is possible
‣ Avoid double-negative test items; they are very
confusing to students and do not accurately measure
knowledge
‣ Avoid using personal pronouns such as "you"
‣ Do not use test items that test trivia or obscure facts
‣ Develop test items that require students to think
about what they have learned, rather than merely
remember it
Written Tests: True-False Guidelines
‣ Avoid unusually long or short test items, because the
length may be a clue; true items are often longer
than false items, because they include a justification
‣ Create brief, simply stated test items that deal with a
single concept; avoid lengthy, complex items that
address more than one concept
‣ Avoid quoting information directly from the textbook
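The earlier guideline about distributing true and false items randomly can be sketched as a small answer-key generator; the item counts and seed are arbitrary choices for the example:

```python
# Minimal sketch: building a true-false answer key with the desired
# mix of answers, then shuffling so the distribution is random.
# A fixed seed makes the shuffle reproducible for a given version.
import random

def make_tf_key(n_true, n_false, seed=None):
    """Return a shuffled answer key with the given mix of T/F items."""
    key = [True] * n_true + [False] * n_false
    random.Random(seed).shuffle(key)
    return key

# A 20-item section with an even split of true and false statements.
key = make_tf_key(10, 10, seed=42)
```

Seeding per test version also documents the key, which helps when numbering and labeling different versions for test security.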
Written Tests: Matching
‣ Matching test items consist of two parallel columns of
words, phrases, images, or a combination of these
‣ In the most common example, students must match a
word from the left column with its definition from the
right column
Written Tests: Matching
The content of a matching test item must consist of
similar material, items, or information
Written Tests: Matching Guidelines
‣ Keep each group of prompts and its list of responses
together on a single page
‣ Separate matching sections into sets of five problems
and responses when using computer or mechanically
scored answer sheets
‣ Consider preparing one more response than there are
prompts; the extra response requires more precise
knowledge and prevents students from finding an
answer by eliminating all the other possible answers
Written Tests: Matching Guidelines
‣ Number the problem statements; place an answer line to
the left of each number unless a separate answer sheet is
used
‣ Use letters for each response
‣ Arrange problem statements and responses into two
columns
‣ Problem statements on the left side of the page
‣ Responses on the right
‣ Columns may be titled with appropriate headings, such as
Tools and Uses, or Symptoms and Treatments
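The extra-response guideline above can be illustrated with a small generator that shuffles one more response than there are prompts; the terms, definitions, and extra response are placeholder examples:

```python
# Illustrative sketch: building a matching section with one extra
# response so elimination alone cannot reveal the last answer.
# All terms and definitions here are invented placeholders.
import random

def build_matching(pairs, extra_response, seed=None):
    """Return numbered prompts, lettered responses, and the answer key."""
    prompts = [term for term, _definition in pairs]
    responses = [definition for _term, definition in pairs] + [extra_response]
    random.Random(seed).shuffle(responses)
    letters = "ABCDEFGHIJ"
    # Map each numbered prompt to the letter of its matching response.
    key = {i + 1: letters[responses.index(definition)]
           for i, (_term, definition) in enumerate(pairs)}
    return prompts, responses, key

pairs = [("halligan", "forcible entry tool"),
         ("pike pole", "overhaul tool"),
         ("nozzle", "stream-shaping device")]
prompts, responses, key = build_matching(pairs, "water supply appliance", seed=1)
```

With three prompts and four lettered responses, one response is always left unused, as the guideline recommends.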
Written Tests: Matching
NOTE
‣ Instructors should be advised that
matching test items may be more
effectively and efficiently written as a
series of multiple-choice questions.
Written Tests:
Short-answer/Completion
‣ A short-answer item is a question for which students
must write a correct response
‣ To do so, they must recall previously learned
information, apply relevant principles, or understand
methods or procedures
‣ Short-answer items are often subjective
Written Tests:
Short-answer/Completion
‣ A completion item should be objective
‣ This type of test item is a statement in which key
words are replaced with an underlined blank space
that students are tasked to fill in
Written Tests:
Short-answer/Completion Guidelines
‣ On completion test items, create short, direct
statements for which only one answer is possible
‣ Avoid long statements with a string of blanks to fill
‣ Start with a direct question and change it to an
incomplete statement
‣ Make sure that the desired response is a key point in
the lesson
‣ Arrange the statement with the blanks at or near the
end of the sentence
Written Tests:
Short-answer/Completion Guidelines
‣ Avoid statements that call for answers with more than
one word, phrase, or number
‣ Eliminate unnecessary clues, such as answer blanks that
vary in length or the use of the words "a" or "an"
preceding the blank
‣ Write a rubric or detailed answer sheet so that the
scorer understands the full extent of possible,
acceptable answers to the questions
Written Tests: Essay
‣ Like short-answer test items, essays are subjective
‣ Students must construct an in-depth answer on a
topic or question related to a key aspect of the
course material
Written Tests: Essay
‣ The strength of this item type is that it tests the students’
higher level cognitive processes
‣ Students are expected to demonstrate the ability to analyze
a topic, create a solution to a problem, or evaluate a
system or process
‣ Essay tests eliminate guessing, because students must
know the material thoroughly in order to write an effective
essay
‣ Creative students often prefer this type of test item
because it allows them a forum to express their perspective
of a topic
Written Tests: Essay Disadvantages
‣ Essays are time-consuming for students to complete and
instructors to score
‣ Differences in students’ writing ability, penmanship,
spelling, and grammar may affect an instructor’s ability
to easily score the test
‣ Students who have difficulty writing or write slowly will
be at a disadvantage, especially in a timed test
Written Tests: Essay Guidelines
‣ Choose essay topics that reflect key aspects of the
course material
‣ Create a rubric that establishes clear scoring guidelines
‣ For each essay question, provide clear instructions that
define how students should respond, how much time
they should spend responding and how many pages or
paragraphs each response should be
‣ Provide sufficient time for students to respond to all
questions
Written Tests: Interpretive Exercises
‣ The interpretive exercise is another subjective test
item that measures higher level cognitive processes
‣ An exercise consists of introductory material, typically
numerical data, a graph, or a paragraph of text,
followed by a series of test items
‣ Students read the text or look at the illustrations,
then answer the test items, which may be any of the
types described in this chapter
Written Tests: Interpretive Exercises
Rules
‣ Make sure that all introductory material relates to key
learning objectives, and is as concise as possible
‣ Apply relevant guidelines for effective item
construction for each test item
‣ Use test items that require the same type of
performance that is listed in the test specifications
for the various learning objectives
‣ Create original introductory material unfamiliar to
students
Written Tests: Interpretive Exercises
Rules
‣ Ensure that the introductory material does not give
away the answer to any of the test items
‣ Encourage students to read the introductory material
to be able to answer test items
‣ Provide enough test items, using a variety of item
types, to effectively measure students’ understanding
of the material
Oral Tests
Open or closed questions
‣ When the purpose of the test is to determine
knowledge, the questions should be closed, requiring
only a single brief answer
‣ When the purpose is to determine how a student
responds under pressure, the questions should judge
both accuracy and presentation; in this case, the
questions should be open, permitting longer answers
that may lead to further questions
Oral Tests
‣ Oral tests can be very stressful for students
‣ Instructors should provide a relaxed,
comfortable atmosphere for the presentation
of oral tests
Oral Tests: Important Aspects of
Designing and Conducting Oral Tests
Oral Tests
‣ Oral tests are highly subjective, especially when the
questions may be answered a number of ways
‣ To reduce evaluator bias, test developers should
provide a scoring rubric that lists all possible correct
answers
‣ An oral test is the most valid and reliable way to test a
student’s ability to verbally communicate ideas,
concepts, and processes; may be the best measure of a
student’s judgment and thought processes
Performance (Skills) Tests
‣ Assessment is based on either a speed standard such
as timed performance, a quality standard such as
minimum acceptable performance, or both
‣ Performance tests require students to demonstrate
psychomotor proficiency after appropriate practice or
drill sessions
‣ Tests must take place under controlled conditions so
instructors can make reliable, valid judgments about
student performance
Performance (Skills) Tests:
Guidelines
‣ Specify performance objectives to be measured
‣ Select rating factors on which the test will be judged
‣ Provide written instructions that clearly explain the
test situation
‣ Confirm a new performance test with other
instructors or previous students before administering
it to students
Performance (Skills) Tests:
Guidelines
‣ Use more than one test evaluator
‣ Follow established procedures when administering
the test
‣ Make a score distribution chart after tests have been
administered and graded
‣ Rotate team members to every position for team
evaluation ratings
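The score distribution chart mentioned in the guidelines can be started with a simple tally; the bucket width and the scores below are illustrative assumptions:

```python
# Hedged sketch: tallying a score distribution after a performance
# test has been graded. Bucket width and sample scores are assumed.
from collections import Counter

def score_distribution(scores, width=10):
    """Count scores into buckets of the given width (e.g. 70-79)."""
    return Counter((score // width) * width for score in scores)

# Scores from one hypothetical test administration.
dist = score_distribution([95, 88, 72, 85, 91, 78, 66])
```

Printing the counter bucket by bucket gives a rough text histogram; clustering at the low end may signal an item or instruction problem worth reviewing.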
Performance (Skills) Tests: Advantages
Performance (Skills) Tests
Courtesy of California State Fire Training and the Oakland Fire Department
‣ Explain the steps for test planning.
Learning Objective 3
Test Planning
Determining Test Purpose and Classification
A test is designed to
‣ Determine readiness for instruction or placement in
the appropriate instructional level (prescriptive or
placement test)
‣ Measure improved progress or identify learning
problems that are hampering progress (formative
or progress test)
‣ Rate terminal performance (summative or
comprehensive test)
Determining Test Purpose and
Classification: Planning Considerations
‣ Whether the test measures technical knowledge
retention and recall in the cognitive domain (written
or oral tests)
‣ Measures manipulative skills in the psychomotor
domain (performance or skill tests)
‣ Measures behavioral changes in attitude, values, or
beliefs in the affective domain (written or oral tests)
Identifying Learning Objectives
Constructing Appropriate Test Items
Selecting Proper Level of Test Item Difficulty
Determining the Appropriate Number of Test Items
Eliminating Language and Comprehension Barriers
Avoiding Giving Clues to Test Answers
Ensuring Test Usability
Ensuring Validity and Reliability
Ensuring Validity and Reliability: Steps
‣ Describe the process to select a test
scoring method.
Learning Objective 4
Test Scoring Method Selection
Test Scoring Method Selection: Written Tests
Test Scoring Method Selection: Oral Tests
Test Scoring Method Selection: Performance or Skill Tests
  • 9. Test Formatting and Item Arrangement ‣ Test items must be arranged in a logical sequence, and can be grouped into two categories ‣ Learning domain outcome ‣ Knowledge ‣ Comprehension ‣ Application ‣ Type of test item ‣ Multiple-choice ‣ Matching ‣ Short-answer
  • 10. Test Formatting and Item Arrangement ‣ Instructors must make sure that the wording of one test item does not reveal the answer to another test item ‣ On computer-adaptive tests, such as those used in EMS, sequencing may be organized so that students can only progress to a more difficult question after correctly answering a simpler one
  • 11. Test Item Level of Cognition and Difficulty
  • 12. Test Item Level of Cognition and Difficulty ‣ Test items should evaluate the student’s ability at the level within the taxonomy that corresponds to the learning objective being evaluated; a variety of levels can exist within a course ‣ The actual determination of test difficulty is the responsibility of a Level III Instructor ‣ The instructor compiles scored tests and evaluates students’ performance on individual questions ‣ The difficulty level of each question is then saved for future testing use and evaluation
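The compilation step described above can be sketched in code. The difficulty index used here (the proportion of students answering an item correctly) is a common convention and an assumption on our part; the slides do not prescribe a specific formula, and the question labels are hypothetical.

```python
# Sketch: computing a per-item difficulty index from compiled scored tests.
# The index (proportion answering correctly) is a common convention, not a
# formula taken from the deck.

def item_difficulty(responses):
    """responses: per-student results for one item (True = correct).
    Returns the proportion of students who answered correctly (0.0-1.0)."""
    if not responses:
        raise ValueError("no responses recorded")
    return sum(responses) / len(responses)

# Hypothetical compiled results for three questions across five scored tests
scored = {
    "Q1": [True, True, True, True, False],    # easier item
    "Q2": [True, False, True, False, False],  # moderate item
    "Q3": [False, False, True, False, False], # harder item
}
difficulty = {q: item_difficulty(r) for q, r in scored.items()}
# difficulty -> {"Q1": 0.8, "Q2": 0.4, "Q3": 0.2}
```

Saving these values with the item bank lets a Level III Instructor track question difficulty across future administrations.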
  • 13. Test Instructions ‣ Purpose of the test ‣ Method and means for recording answers ‣ Recommendation whether to guess when undecided on an answer (in some cases, incorrect answers are penalized more than not answering the question) ‣ Amount of time available to complete the test
  • 14. Time Requirements for Answering Certain Types of Questions
  • 15. Time Requirements for Answering Certain Types of Questions
  • 16. Time Requirements for Answering Certain Types of Questions ‣ Instructors can use these estimates to calculate how much time students will need to complete the test ‣ May take the test, or ask another professional member of the organization to take the test, to see how much time is needed ‣ Tests should be an appropriate length to address the learning objectives that the test is intended to evaluate
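The calculation described above can be sketched as follows. The minutes-per-item values are illustrative placeholders only; the deck's actual time-requirement table (slides 14 and 15) did not survive extraction.

```python
# Sketch: estimating total test time from per-item-type time estimates.
# The minutes-per-item figures below are illustrative placeholders, not
# values from the deck's table.

MINUTES_PER_ITEM = {
    "true-false": 0.5,
    "multiple-choice": 1.0,
    "short-answer": 2.0,
    "essay": 15.0,
}

def estimated_minutes(item_counts):
    """item_counts: mapping of item type -> number of items on the test."""
    return sum(MINUTES_PER_ITEM[t] * n for t, n in item_counts.items())

# A test with 20 multiple-choice, 10 true-false, and 5 short-answer items
total = estimated_minutes({"multiple-choice": 20,
                           "true-false": 10,
                           "short-answer": 5})
# total -> 35.0 minutes
```

An instructor could confirm such an estimate by taking the test themselves, as the slide suggests.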
  • 17. Time Requirements for Answering Certain Types of Questions ‣ When time is a restrictive factor, tests can emphasize the most critical learning objectives and include a sampling of less important objectives; this method of test construction is called sampling ‣ The plan must be documented to prove that necessary components were addressed ‣ The most critical objectives must be tested in each version of the test, while less critical objectives are rotated among versions
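A documented sampling plan of the kind described above might be sketched as follows. The objective names and per-version count are hypothetical; the point is that critical objectives appear on every version while less critical ones rotate in a predictable, documentable cycle.

```python
# Sketch of a sampling plan: critical objectives on every test version,
# less critical objectives rotated in a fixed cycle. Objective names and
# the per-version count are hypothetical.

CRITICAL = ["O1", "O2", "O3"]
LESS_CRITICAL = ["O4", "O5", "O6", "O7"]

def version_objectives(version, per_version=2):
    """Return the objectives covered by a given test version.
    Rotation is deterministic, so coverage can be documented."""
    start = (version * per_version) % len(LESS_CRITICAL)
    rotated = [LESS_CRITICAL[(start + i) % len(LESS_CRITICAL)]
               for i in range(per_version)]
    return CRITICAL + rotated

# version 0 -> ["O1", "O2", "O3", "O4", "O5"]
# version 1 -> ["O1", "O2", "O3", "O6", "O7"]
```

Printing the plan for each version produces the documentation the slide calls for.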
  • 18. Testing Bias ‣ Test items and testing instruments should not favor or penalize any particular group of students ‣ Ensuring that test questions very closely reflect the materials being tested is the best way to avoid bias in testing materials ‣ When students recognize that test items closely resemble the information that they have studied, they are more likely to perform confidently on tests regardless of their gender, cultural, ethnic, or regional backgrounds
  • 19. Testing Bias ‣ In the fire and emergency services, bias is generally limited to use of regional jargon and differences in terminology ‣ Example: local governments in the U.S. and Canada may be referred to as ‣ Counties or parishes ‣ Jurisdictions in legal terms
  • 20. Testing Bias ‣ Similarly, some fire apparatus may be referred to as a tanker or a tender depending on geographical region or the differences between departments ‣ The terminology on the test should reflect the terminology of the students and the materials from which they studied or received training
  • 21. ‣ Discuss various types of evaluation instruments used in fire and emergency service training. Learning Objective 2
  • 22. ‣ Deciding how to evaluate students can be a confusing and difficult topic for the Level II Instructor ‣ Instructor needs to establish the appropriate evaluation instrument to meet the parameters of the curriculum and course being presented ‣ In addition, once the instructor determines the type of instrument to use, they must then design the instrument to be fair and unbiased ‣ This process takes time, and new Level II Instructors often underestimate the time commitment Student Evaluation Instruments
  • 25. Written Tests: Objective ‣ An objective test item is a question for which there is only one correct answer ‣ Judgment of the instructor or evaluator is not relevant and has no effect on assessment ‣ Objective items measure cognitive learning, but typically only at the lower levels of remembering and understanding
  • 26. Written Tests: Objective ‣ Properly constructed objective test items can also be used to measure higher levels of cognitive learning such as evaluation or creation ‣ Three main types of objective questions ‣ Multiple-choice ‣ True or false ‣ Matching
  • 27. Written Tests: Subjective ‣ A subjective test item has no single correct answer; the evaluator’s judgment may therefore affect assessment ‣ Subjective items are an effective way of measuring higher cognitive levels ‣ They allow students the freedom to organize, analyze, revise, redesign, or evaluate a problem
  • 28. Written Tests: Subjective ‣ The strength of a student’s response to these items depends on a variety of factors, such as ‣ How well they communicate their ideas ‣ Personal opinions of the evaluator
  • 29. Written Tests: Subjective ‣ There are three main types of subjective test items ‣ Short-answer or completion ‣ Essay ‣ Interpretive exercise
  • 30. Written Tests: Multiple Choice A multiple-choice test item consists of either a question or an incomplete statement, called the stem, plus a list of several possible responses, which are referred to as choices or alternatives
  • 31. Written Tests: Multiple Choice ‣ Students are tasked to read the stem and select the correct response from the list of alternatives ‣ The correct choice is known as the answer and the remaining choices are called distractors ‣ Distractors are used to discriminate between students who understand the subject matter well and those who are uncertain of the correct answer ‣ Distractors are not meant to trick, confuse, or mislead students
  • 32. ‣ Write the stem in the form of a direct question or an incomplete sentence that measures only one learning objective ‣ Write a clear, brief stem that contains most of the wording for the test item ‣ This helps to avoid placing repeated words in the alternatives ‣ Write positive questions as much as possible; be consistent in labeling negative words if and when negative statements are used Written Tests: Multiple Choice Guidelines
  • 33. ‣ Provide at least three plausible, attractive distractors ‣ Phrase the choices so that they are parallel and grammatically consistent with the stem ‣ Place correct answers in varied positions among the A, B, C, and D choices ‣ Place each choice on a separate, indented line, and in a single column ‣ Begin responses with capital letters when the stem is a complete question Written Tests: Multiple Choice Guidelines
  • 34. ‣ Begin responses with lowercase letters when the stem is an incomplete sentence ‣ Do not include choices that are obviously wrong or intended to be humorous ‣ Make sure that stems and alternatives do not give students grammatical clues as to the correct response ‣ Make all alternatives close to the same length Written Tests: Multiple Choice Guidelines
  • 35. ‣ Avoid using the phrases ‣ All of the above ‣ None of the above ‣ Do not test trivial ideas or information ‣ Use correct grammar and punctuation Written Tests: Multiple Choice Guidelines
  • 36. Written Tests: Multiple Choice Disadvantages ‣ They are not well suited to measuring certain cognitive skills, such as organizing and presenting ideas; essay tests are more effective for this purpose ‣ Depending on the test writer’s skill, this type of test may not include different difficulty-level test items that measure a variety of cognitive learning levels ‣ Creating appropriate and plausible distractors for each stem can require significant time and thought ‣ Students who do not know the material may still be able to guess the correct answer
  • 37. ‣ The true-false test item is a single statement that the student must determine to be either true or false ‣ It is difficult to construct a statement that is completely true or completely false Written Tests: True-False
  • 38. Written Tests: True-False ‣ True statements should be based on facts ‣ False statements should be based on common misconceptions of the facts ‣ In addition to the traditional true-false test items, there are also modified true-false test items ‣ Modified true-false items ask the student to explain why an item is false or to rewrite the item to make it true
  • 39. Written Tests: True-False ‣ One limitation of true-false questions ‣ Students tend to remember the false items on the test as being true, known as the negative suggestion effect ‣ Instructors should review the correct answers to true-false questions with students after scoring the test to help combat this effect
  • 40. Written Tests: True-False Guidelines ‣ Write the words True and False at the left margin if students must mark their answers on the test paper ‣ On computer-scored answer sheets ‣ True may be assigned to A ‣ False may be assigned to B ‣ Provide clear instructions so that students know how to respond to each statement
  • 41. Written Tests: True-False Guidelines ‣ Create enough test items to provide reliable results ‣ For reliability purposes, more true/false items are needed than the number used for multiple-choice items ‣ A large number of test items minimizes the possibility of guessing the correct answers ‣ Distribute true and false items randomly
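The random distribution of true and false items called for above can be sketched as follows; the item counts and seed are illustrative.

```python
import random

# Sketch: generating a randomly distributed true-false answer key so that
# students cannot guess answers from a pattern. Counts and seed are
# illustrative.

def random_answer_key(n_true, n_false, seed=None):
    """Return a shuffled list of True/False answers with the given counts."""
    key = [True] * n_true + [False] * n_false
    rng = random.Random(seed)  # seeded for a reproducible key
    rng.shuffle(key)
    return key

key = random_answer_key(10, 10, seed=42)
# key contains 10 True and 10 False values in random order
```

Fixing the seed makes the key reproducible for a given test version while still avoiding a predictable pattern.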
  • 42. Written Tests: True-False Guidelines ‣ Avoid determiners (words that indicate a specific answer) that provide unwarranted clues ‣ Words such as usually, generally, often, or sometimes are most likely to appear in true statements ‣ The words never, all, always, or none are more likely to be found in false statements ‣ Avoid creating items that could trick or mislead students into making a mistake
  • 43. Written Tests: True-False Guidelines ‣ Ensure only one correct answer is possible ‣ Avoid double-negative test items; they are very confusing to students and do not accurately measure knowledge ‣ Avoid using personal pronouns such as "you" ‣ Do not use test items that test trivia or obscure facts ‣ Develop test items that require students to think about what they have learned, rather than merely remember it
  • 44. Written Tests: True-False Guidelines ‣ Avoid unusually long or short test items, because the length may be a clue; true items are often longer than false items, because they include a justification ‣ Create brief, simply stated test items that deal with a single concept; avoid lengthy, complex items that address more than one concept ‣ Avoid quoting information directly from the textbook
  • 45. Written Tests: Matching ‣ Matching test items consist of two parallel columns of words, phrases, images, or a combination of these ‣ In the most common example, students must match a word from the left column with its definition from the right column
  • 46. Written Tests: Matching The content of a matching test item must consist of similar material, items, or information
  • 48. Written Tests: Matching Guidelines ‣ Keep each group of prompts and its list of responses together on a single page ‣ Separate matching sections into sets of five problems and responses when using computer or mechanically scored answer sheets ‣ Consider preparing one more response than there are prompts; the extra response requires more precise knowledge and prevents students from finding an answer by eliminating all the other possible answers
  • 49. Written Tests: Matching Guidelines ‣ Number the problem statements; place an answer line to the left of each number unless a separate answer sheet is used ‣ Use letters for each response ‣ Arrange problem statements and responses into two columns ‣ Problem statements on the left side of the page ‣ Responses on the right ‣ Columns may be titled with appropriate headings, such as Tools and Uses, or Symptoms and Treatments
  • 51. NOTE ‣ Instructors should be advised that matching test items may be more effectively and efficiently written as a series of multiple-choice questions.
  • 52. Written Tests: Short-answer/Completion ‣ A short-answer item is a question for which students must write a correct response ‣ To do so, they must recall previously learned information, apply relevant principles, or understand methods or procedures ‣ Short-answer items are often subjective
  • 53. Written Tests: Short-answer/Completion ‣ A completion item should be objective ‣ This type of test item is a statement in which key words are replaced with an underlined blank space that students are tasked to fill in
  • 55. Written Tests: Short-answer/Completion Guidelines ‣ On completion test items, create short, direct statements for which only one answer is possible ‣ Avoid long statements with a string of blanks to fill ‣ Start with a direct question and change it to an incomplete statement ‣ Make sure that the desired response is a key point in the lesson ‣ Arrange the statement with the blanks at or near the end of the sentence
  • 56. Written Tests: Short-answer/Completion Guidelines ‣ Avoid statements that call for answers with more than one word, phrase, or number ‣ Eliminate unnecessary clues, such as answer blanks that vary in length or the use of the words "a" or "an" preceding the blank ‣ Write a rubric or detailed answer sheet so that the scorer understands the full extent of possible, acceptable answers to the questions
  • 57. Written Tests: Essay ‣ Like short-answer test items, essays are subjective ‣ Students must construct an in-depth answer on a topic or question related to a key aspect of the course material
  • 58. Written Tests: Essay ‣ The strength of this item type is that it tests the students’ higher level cognitive processes ‣ Students are expected to demonstrate the ability to analyze a topic, create a solution to a problem, or evaluate a system or process ‣ Essay tests eliminate guessing, because students must know the material thoroughly in order to write an effective essay ‣ Creative students often prefer this type of test item because it allows them a forum to express their perspective of a topic
  • 59. Written Tests: Essay Disadvantages ‣ Essays are time-consuming for students to complete and instructors to score ‣ Differences in students’ writing ability, penmanship, spelling, and grammar may affect an instructor’s ability to easily score the test ‣ Students who have difficulty writing or write slowly will be at a disadvantage, especially in a timed test
  • 60. Written Tests: Essay Guidelines ‣ Choose essay topics that reflect key aspects of the course material ‣ Create a rubric that establishes clear scoring guidelines ‣ For each essay question, provide clear instructions that define how students should respond, how much time they should spend responding, and how many pages or paragraphs each response should be ‣ Provide sufficient time for students to respond to all questions
  • 61. Written Tests: Interpretive Exercises ‣ The interpretive exercise is another subjective test item that measures higher level cognitive processes ‣ An exercise consists of introductory material, typically numerical data, a graph, or a paragraph of text, followed by a series of test items ‣ Students read the text or look at the illustrations, then answer the test items, which may be any of the types described in this chapter
  • 62. Written Tests: Interpretive Exercises Rules ‣ Make sure that all introductory material relates to key learning objectives, and is as concise as possible ‣ Apply relevant guidelines for effective item construction for each test item ‣ Use test items that require the same type of performance that is listed in the test specifications for the various learning objectives ‣ Create original introductory material unfamiliar to students
  • 63. Written Tests: Interpretive Exercises Rules ‣ Ensure that the introductory material does not give away the answer to any of the test items ‣ Encourage students to read the introductory material to be able to answer test items ‣ Provide enough test items, using a variety of item types, to effectively measure students’ understanding of the material
  • 64. Oral Tests Open or closed questions ‣ When the purpose of the test is to determine knowledge, the questions should be closed, requiring only a single brief answer ‣ When the purpose is to determine how a student responds under pressure, the question should judge both accuracy and presentation; in this case, the questions should be open, permitting longer answers that may lead to further questions
  • 65. Oral Tests ‣ Oral tests can be very stressful for students ‣ Instructors should provide a relaxed, comfortable atmosphere for the presentation of oral tests
  • 66. Oral Tests: Important Aspects of Designing and Conducting Oral Tests
  • 67. Oral Tests ‣ Oral tests are highly subjective, especially when the questions may be answered a number of ways ‣ To reduce evaluator bias, test developers should provide a scoring rubric that lists all possible correct answers ‣ An oral test is the most valid and reliable way to test a student’s ability to verbally communicate ideas, concepts, and processes; it may be the best measure of a student’s judgment and thought processes
  • 70. Performance (Skills) Tests ‣ Assessment is based on either a speed standard such as timed performance, a quality standard such as minimum acceptable performance, or both ‣ Performance tests require students to demonstrate psychomotor proficiency after appropriate practice or drill sessions ‣ Tests must take place under controlled conditions so instructors can make reliable, valid judgments about student performance
  • 71. Performance (Skills) Tests: Guidelines ‣ Specify performance objectives to be measured ‣ Select rating factors on which the test will be judged ‣ Provide written instructions that clearly explain the test situation ‣ Confirm a new performance test with other instructors or previous students before administering it to students
  • 72. Performance (Skills) Tests: Guidelines ‣ Use more than one test evaluator ‣ Follow established procedures when administering the test ‣ Make a score distribution chart after tests have been administered and graded ‣ Rotate team members to every position for team evaluation ratings
  • 77. ‣ Explain the steps for test planning. Learning Objective 3
  • 79. Determining Test Purpose and Classification
  • 80. Determining Test Purpose and Classification
  • 81. Determining Test Purpose and Classification
  • 82. Determining Test Purpose and Classification
  • 83. Is designed to ‣ Determine readiness for instruction or placement in the appropriate instructional level (prescriptive or placement test) ‣ Measure improved progress or identify learning problems that are hampering progress (formative or progress test) ‣ Rate terminal performance (summative or comprehensive test) Determining Test Purpose and Classification: Planning Considerations
  • 84. ‣ Whether the test measures technical knowledge retention and recall in the cognitive domain (written or oral tests) ‣ Measures manipulative skills in the psychomotor domain (performance or skill tests) ‣ Measures behavioral changes in attitude, values, or beliefs in the affective domain (written or oral tests) Determining Test Purpose and Classification: Planning Considerations
  • 89. Selecting Proper Level of Test Item Difficulty
  • 90. Selecting Proper Level of Test Item Difficulty
  • 91. Selecting Proper Level of Test Item Difficulty
  • 92. Selecting Proper Level of Test Item Difficulty
  • 93. Determining the Appropriate Number of Test Items
  • 94. Determining the Appropriate Number of Test Items
  • 95. Determining the Appropriate Number of Test Items
  • 98. Avoiding Giving Clues to Test Answers
  • 99. Avoiding Giving Clues to Test Answers
  • 100. Avoiding Giving Clues to Test Answers
  • 104. Ensuring Validity and Reliability
  • 105. Ensuring Validity and Reliability
  • 106. Ensuring Validity and Reliability
  • 107. Ensuring Validity and Reliability: Steps
  • 108. Ensuring Validity and Reliability: Steps
  • 109. Ensuring Validity and Reliability
  • 110. Ensuring Validity and Reliability
  • 111. ‣ Describe the process to select a test scoring method. Learning Objective 4
  • 112. Test Scoring Method Selection
  • 113. Test Scoring Method Selection: Written Tests
  • 114. Test Scoring Method Selection: Oral Tests
  • 115. Test Scoring Method Selection: Performance or Skill Tests
  • 116. Test Scoring Method Selection: Performance or Skill Tests
  • 117. Test Scoring Method Selection