Test Design, Construction,
and Administration
CRMEF Souss Massa
Module: Testing and Assessment
Department of English
Prepared by:
Hassan BOUKSIM
Elyazid BENHMAIDA
Trainer:
Prof. Ayad CHRAA
Outline:
1. Test Construction
2. Test Administration
3. Test Scoring
Test Construction
01.
Designing a test
According to Ur (1996), there are three stages in designing a test:
Preparation
Performance
Feedback
Designing Tests
Brown (2004) suggests five key questions that need to be considered when designing or revising tests. They concern:
a) The purpose of the test
b) Its objectives
c) The way in which the test’s purpose and objectives are
reflected in the test specifications
d) The ways in which the tasks in the test are selected and
separated
e) The kinds of grading or feedback and scoring that are
expected.
Designing Tests: Purpose
Language Aptitude Test: a test designed to measure capacity or general ability to learn a foreign language and ultimate success in that undertaking.
Proficiency Test: a test that measures a learner's overall ability in a language without being tied to a specific course, textbook, or curriculum.
Placement Test: a test designed to place a student into a particular level, course, or section of a language programme and to assign students to the class that is most suitable for their level.
Diagnostic Test: a test that provides more detailed information that can be used in planning instruction that matches the students' needs.
Achievement Test: a test that measures to what extent students have acquired language features that have already been taught.
Designing Tests: Objectives
What is it you want to test?
Take a careful look at everything you think your students should be able to do based on the material they are responsible for.
Examine the objectives of the units you are testing.
Designing Tests: Test Specifications
Test specifications cover the following points:
What skills will be included?
What types of items and tasks will it include?
How many items will be included in each part of the test?
How much time will be allocated to each item?
How will the test be scored?
Developing Tests: Test Specifications
Davidson and Lynch (2002) provide a useful format for test specification, which is summarized as follows:
1. General description (GD): A brief general statement of the behaviour to be tested, similar to the learning objective.
2. Prompt attributes (PA): A complete and detailed description of what the student
will see in the test.
3. Response attributes (RA): A complete and detailed description of the way in
which the student will provide the answer, and what will constitute failure or success.
4. Sample item (SI): An illustrative item or task that the specification should generate.
5. Specification supplement (SS): A specification of any additional information
needed to construct test items, such as a list of sources from which reading passages
may be selected in a reading test.
Developing Test Specifications
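As a rough sketch of what a single specification written in this format might look like, the snippet below records the five components (GD, PA, RA, SI, SS) in a plain Python dictionary; the reading item it describes is hypothetical and not taken from the source.

```python
# A minimal sketch of one test specification using the five components
# summarized from Davidson and Lynch (2002). The reading item described
# here is hypothetical, for illustration only.

reading_spec = {
    "GD": "Students can identify the main idea of a short expository paragraph.",
    "PA": ("An 80-120 word paragraph at A2 level, followed by one "
           "multiple-choice question with four options."),
    "RA": ("The student circles one option; only the keyed option counts as "
           "correct, and any other choice or a blank counts as incorrect."),
    "SI": "Read the paragraph and choose the best title: a) ... b) ... c) ... d) ...",
    "SS": "Passages are selected from graded readers at A2 level.",
}

for component, description in reading_spec.items():
    print(f"{component}: {description}")
```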
Developing Tests: Devising Test Items
01. Discrete-point tests
reflect the view that language consists of different components (e.g. grammar, pronunciation, vocabulary) and different skills (e.g. listening, speaking, reading, writing), and that these can be tested separately.
Examples: multiple-choice tests, true–false tests, gap-filling items.
Developing Tests: Devising Test Items
02. Integrative tests
require the learner to use several different skills at the same time, and are thought to better capture the knowledge that underlies authentic use of language.
Examples: gap-fill (cloze) tests, dictation tests.
Developing Tests: Devising Test Items
Integrative tests: gap-fill (cloze) tests
In a cloze test, ______ number of words are ______ from a passage, and ________ learners have to complete _____ missing words. The words ______ be deleted regularly (e.g. _____ fifth word), or deletions may ______ based on what the ________ is designed to measure. _______ tests are said to ________ integrative tests, since they ________ upon the learners’ knowledge ________ grammar, vocabulary and text _________.
Developing Tests: Devising Test Items
Integrative tests: gap-fill (cloze) tests
In a cloze test, a number of words are deleted from a passage, and the learners have to complete the missing words. The words may be deleted regularly (e.g. every fifth word), or deletions may vary based on what the test is designed to measure. Cloze tests are said to be integrative tests, since they draw upon the learners’ knowledge of grammar, vocabulary and text cohesion.
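To make the regular-deletion procedure concrete, here is a minimal sketch (the function name and sample passage are illustrative, not from the source) that blanks out every nth word of a passage and keeps the deleted words as an answer key.

```python
# A minimal sketch of building a fixed-ratio cloze test: every nth word is
# replaced by a blank and collected into an answer key. The function name
# and sample passage are illustrative only.

def make_cloze(passage: str, n: int = 5):
    """Blank out every nth word of the passage, starting with the nth word."""
    words = passage.split()
    answer_key = []                      # the deleted words, in order
    for i in range(n - 1, len(words), n):
        answer_key.append(words[i])
        words[i] = "______"
    return " ".join(words), answer_key


text = ("In a cloze test a number of words are deleted from a passage "
        "and the learners have to complete the missing words")

cloze_text, key = make_cloze(text, n=5)
print(cloze_text)
print("Answer key:", key)
```

Deletions based on what the test is designed to measure (for example, blanking only grammatical words) would replace the fixed step with a selection rule.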
Test Administration
02.
Questions to keep in mind before the test
How far in advance do you announce the test?
How much do you tell the class about what is going to be in
it, and about the criteria for marking?
How much information do you need to give them about the
time, place, any limitations or rules?
Do you give them any ‘tips’ about how best to cope with the
test format?
Do you expect them to prepare at home, or do you give
them some class time for preparation?
Administering the Test:
Before the test:
Announcing the test at least a week in advance (give details of where, when, and how).
Telling the class precisely what material is to be tested, what sort of items will be used, and how answers will be assessed.
Giving “test tips” to students.
Holding a revision session with students to help them with pre-test learning.
Questions to keep in mind when giving the test
How important is it for you yourself to administer the test?
Assuming that you do, what do you say before giving out
the test papers?
Do you add anything when the papers have been
distributed but students have not yet started work?
During the test, are you absolutely passive or are you
interacting with the students in any way?
Administering the Test:
Giving the test:
Administering the test yourself.
Reminding the students about the content, format, and marking system.
Running through the instructions to make sure everything is clear.
AND MOST IMPORTANTLY: WISHING THEM “GOOD LUCK” :)
Questions to keep in mind after the test
How long does it take you to mark and return the papers?
Do you then go through them in class?
Do you demand any follow-up work on the part of the
students?
Administering the Test:
After the test:
Marking and returning the tests as quickly as possible (within a week).
Going through the answers fairly quickly in class.
Test Scoring
03.
What is scoring in testing?
Scoring is the process of
assigning values (numerical or
descriptive) to students'
responses to assess their
performance.
It determines how well a learner
has achieved the objectives of
the test.
Types of scoring methods
Objective Scoring:
Objective scoring is used when test
items have one clearly correct
answer. The scoring process is
mechanical or rule-based, requiring
no examiner judgment.
Examples:
Multiple Choice Questions
(MCQs)
True/False Items
Matching Items
Cloze Tests (with a fixed answer
key)
Grammar/vocabulary gap-fill
items
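Since objective scoring requires no examiner judgment, it amounts to comparing responses with a fixed answer key, as in this minimal sketch (item labels and keys are hypothetical):

```python
# A minimal sketch of objective scoring: each response is checked against a
# fixed answer key, so no rater judgment is involved. Item labels and keys
# are hypothetical.

answer_key = {"Q1": "B", "Q2": "True", "Q3": "C", "Q4": "A", "Q5": "False"}
responses  = {"Q1": "B", "Q2": "False", "Q3": "C", "Q4": "A", "Q5": "False"}

score = sum(1 for item, key in answer_key.items() if responses.get(item) == key)
print(f"Score: {score}/{len(answer_key)}")  # -> Score: 4/5
```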
Subjective Scoring
Subjective scoring is used when
answers are open-ended and
require human judgment. These are
typically tasks with no single correct
answer.
Examples:
Essay writing
Descriptive or narrative writing
tasks
Speaking tasks (monologues or
conversations)
Interpretive reading responses
Translation or paraphrasing
exercises
Types of scoring methods
Holistic vs. Analytical Scoring
Holistic Scoring
Holistic scoring assigns a single
overall score to a student’s
performance based on a general
impression of the quality of the
response.
Process of scoring
The rater reads or listens to the
full response.
The response is judged as a
whole, considering overall
effectiveness, fluency, and
coherence.
A rubric with broad band
descriptors is typically used
(e.g., scores from 1 to 5).
Holistic vs. Analytical Scoring
Analytical Scoring
Analytic scoring breaks the student’s
response into separate components,
and assigns a score to each.
Common components include
content, organization, grammar,
vocabulary, and mechanics.
Process of scoring
Each criterion is clearly
described in a scoring rubric.
The final score is the sum or
average of scores across all
categories.
For example, a writing task gets:
4/5 for Content
3/5 for Organization
2/5 for Grammar
3/5 for Vocabulary
Total: 12/20
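The arithmetic in this example can be expressed directly, as in the small sketch below (the component marks mirror the worked example; everything else is illustrative):

```python
# A minimal sketch of combining analytic scores: each criterion is rated on
# its own scale and the final mark is the sum. The marks mirror the worked
# example above: 4 + 3 + 2 + 3 = 12 out of 20.

component_scores = {
    "Content":      (4, 5),   # (score, maximum)
    "Organization": (3, 5),
    "Grammar":      (2, 5),
    "Vocabulary":   (3, 5),
}

total   = sum(score for score, _ in component_scores.values())
maximum = sum(out_of for _, out_of in component_scores.values())
print(f"Total: {total}/{maximum}")  # -> Total: 12/20
```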
Unit 5: Recreation
Functions: expressing likes and dislikes; inviting, accepting, and declining invitations; agreeing and disagreeing.
Writing: using capitalization and punctuation correctly.
Grammar: comparatives and superlatives; future with present continuous, be going to, and will.
Vocabulary: different types of recreation activities (outdoors and indoors).
Test specifications for this unit would answer the following points (one possible set of answers is sketched below):
What skills will be included?
What types of items and tasks will it include?
How many items will be included in each part of the test?
How much time will be allocated to each item?
How will the test be scored?
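For illustration, here is one hypothetical way these five points could be answered for this unit, recorded as a small Python dictionary; the item counts, timings, and scoring choices are invented, and only the unit content comes from the slide.

```python
# A hypothetical test specification for the Unit 5 (Recreation) example,
# answering the five points above. Item counts, timings, and scoring
# choices are invented for illustration.

unit5_spec = {
    "skills": ["functions", "grammar", "vocabulary", "writing"],
    "items_and_tasks": {
        "functions":  "multiple choice on invitations, agreeing/disagreeing",
        "grammar":    "gap-fill on comparatives/superlatives and future forms",
        "vocabulary": "matching indoor/outdoor recreation activities",
        "writing":    "short paragraph with correct capitalization and punctuation",
    },
    "number_of_items":  {"functions": 6, "grammar": 8, "vocabulary": 6, "writing": 1},
    "minutes_per_part": {"functions": 10, "grammar": 15, "vocabulary": 10, "writing": 20},
    "scoring": {
        "functions": "objective", "grammar": "objective",
        "vocabulary": "objective", "writing": "analytic rubric out of 20",
    },
}

for point, answer in unit5_spec.items():
    print(point, "->", answer)
```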
References
Brown, H. D. (2004). Language assessment: Principles and classroom practices. Pearson.
Davidson, F., & Lynch, B. K. (2002). Testcraft: A teacher's guide to writing and using language test specifications. Yale University Press.
Richards, J. C. (2015). Key issues in language teaching. Cambridge University Press.
Ur, P. (1996). A course in language teaching: Practice and theory. Cambridge University Press. http://ci.nii.ac.jp/ncid/BA27561374
Weir, C. J. (2005). Language testing and validation: An evidence-based approach. Palgrave Macmillan.
Thank you!!
