The Stages in Test Design, Construction and Administration
CRMEF Souss Massa
Module: Testing & Assessment
Trainer: Prof. Ayad Chraa
Presented by: Kaoutar El Assri & Hamza Bellahammou
Academic year: 2024/2025
Before jumping into item writing, it's important to remember that writing test items is not the starting point of test design. As Fulcher (2009) emphasizes, effective test development begins with a clear test purpose, not with tasks.
Why am I testing?
1_Identifying Purposes
• Formative Assessment
• Diagnostic Assessment
• Summative Assessment
• Placement Test
Each purpose influences the design and content of the test.
2_Identifying Objectives
• Based on the syllabus, outcomes, or teaching aims.
• Should reflect what learners should be able to do (skills, knowledge, performance).
🔹 Hughes emphasizes that objectives must be clearly defined to ensure test validity.
3_Choosing the Format
Objective Test Items
• Multiple Choice Questions (MCQs)
• True/False
• Matching
• Fill in the blanks
These items have a single correct answer and are easy to score consistently.
Subjective Test Items
• Essay Questions
• Short Answer Questions
• Paragraph Writing
These require constructed responses and involve personal judgment in scoring.
Performance-Based Tasks
• Oral Presentations or Interviews
• Role-plays
• Listening and Responding
• Project Work
These assess real-world language use, especially in productive skills like speaking and writing.
4_Selecting Content Areas and Skills to be Assessed
• Choose topics and language functions already taught.
• Ensure balance in the skills covered: listening, speaking, reading, writing.
• Example: If the unit is about travel, test vocabulary, grammar (e.g., past tense), and speaking about past trips.
Writing Test Items
1_Ensuring Clarity and Appropriateness
• Use simple, clear instructions.
• Avoid confusing or misleading language.
2_Avoiding Ambiguity and Bias
• Each item should have one clear correct answer.
• Avoid cultural, gender, or regional bias.
Reviewing and Revising Test Items
• Check for alignment with objectives.
• Revise for clarity, balance, and fairness.
• Ask a peer or a colleague to help you review it.
Pre-testing and Piloting
If possible, try the test with a small group of students from the same level or class. If that's not feasible, pilot it yourself:
• Read the test aloud and answer each item as if you were a student.
• Check timing: Is it too long? Too short?
• Verify clarity: Are the instructions and questions understandable?
Stages to Follow When Designing a Test
D. Analyzing Item Performance
This step involves checking how well each test question worked. You analyze:
• Difficulty level – how many students answered each item correctly.
• Discrimination index – how well an item distinguishes between high- and low-performing students.
Example: If 90% of students get a question right, it might be too easy. If top-performing students mostly got one question right while low performers didn't, that item has good discrimination.
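A quick worked illustration (the class size and scores below are hypothetical; the formulas are the standard ones for classroom item analysis):
• Difficulty (facility value): p = number of correct answers ÷ number of test takers. If 27 of 30 students answer an item correctly, p = 27 ÷ 30 = 0.90, so the item is probably too easy.
• Discrimination index: D = proportion correct in the top-scoring group − proportion correct in the bottom-scoring group. If 8 of 10 high scorers answer an item correctly but only 2 of 10 low scorers do, D = 0.8 − 0.2 = 0.6, which suggests good discrimination; values near zero or negative flag items to revise.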
E. Finalizing the Test
Once the analysis is done, revise or remove weak items, check clarity and layout, and make sure scoring rubrics are ready.
Example: After reviewing, you find that two vocabulary questions confuse many students due to unclear wording. You revise them for clarity and ensure that all tasks are aligned with the learning goals.
F. Ensuring Validity and Reliability
• Validity: The test measures what it's supposed to measure (e.g., grammar if that's the goal).
• Reliability: The test gives consistent results across time or raters.
Example: To ensure validity, include questions only on taught content like the Present Simple. To improve reliability, use a consistent rubric for writing tasks with clear scoring categories like grammar, vocabulary, and task completion.
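A consistent rubric could be sketched like this (the bands and point values are illustrative, not prescribed by the module):
• Grammar: 0–3 points (3 = consistently accurate; 0 = errors block understanding)
• Vocabulary: 0–3 points (3 = varied and appropriate to the task; 0 = very limited)
• Task completion: 0–4 points (4 = all parts of the task fully addressed; 0 = task not attempted)
Applying the same descriptors to every script, and across raters, is what makes the scoring reliable.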
• Test content selection and balance (test validity)
• Appropriate item types (matching form to function)
• Number of items and timing (practicality)
• Clear scoring rubrics and organization (reliability and fairness)
• Guidelines for evaluating test quality (layout, clarity, item difficulty)
Before the Test:
• Tell students what the test covers, key topics, and its length.
• Encourage a quick review: skim notes, highlight main ideas.
• Offer practice questions if possible.
• Remind them to arrive early and be prepared.
During the Test:
• Ask students to quickly scan the test before starting.
• Remind them to stay focused.
• Give a time warning near the end for checking answers.
• Monitor the room to ensure fairness and prevent cheating.
After the Test:
• Give feedback on strengths and areas to improve.
• Discuss the results in class and invite questions.
• Encourage students to focus on weak points moving forward.
• Brown, H. D. (2004). Language assessment: Principles and classroom practices. Pearson Education.
• Hughes, A. (2003). Testing for language teachers (2nd ed.). Cambridge University Press.
• Harmer, J. (2007). The practice of English language teaching (4th ed.). Pearson Longman.
• Moroccan Guidelines for Middle School Teaching
1. According to Fulcher (2009), what should be the first step in test design?
A. Writing multiple choice questions
B. Choosing test format
C. Identifying the test purpose ✅
D. Selecting test content
2. Why is it important to define test objectives clearly, as emphasized by Hughes?
A. To reduce grading time
B. To ensure test validity ✅
C. To create difficult items
D. To increase test length
3. Which of the following is considered a subjective test item?
A. Multiple Choice Question
B. True/False Item
C. Paragraph Writing ✅
D. Matching Exercise
4. What is one way to improve a test’s clarity and fairness before finalizing it?
A. Add more complex vocabulary
B. Test it on a different level group
C. Avoid peer review
D. Pre-test or pilot the test ✅
What does "itemdifficulty" refer to when
analyzing test performance?
A. How long the test takes
B. How many students answer an item
correctly ✅
C. The type of test question
D. The student’s opinion about the question
6. What is the purpose of the "discrimination index"?
A. To rank students by age
B. To adjust scores
C. To show if a question separates high and low performers ✅
D. To test vocabulary knowledge
7. What should be done during the 'Finalizing the Test' stage?
A. Grade the tests
B. Remove weak questions and check layout ✅
C. Give practice tests
D. Print student results
8. What should students do first when the test is handed out?
A. Start with the last question
B. Leave the room
C. Quickly scan the test ✅
D. Discuss with a partner