Testing and Test construction (Evaluation in EFL)

Transcript

  • 3. What is testing? It’s an activity whose purpose is to determine what learners can do or know about something. What is a test? It’s a formal instrument to measure what candidates can do or know about something.  
  • 4.
    • What are tests for?
    • To inform learners and teachers of the strengths and weaknesses of the process.
    • To motivate learners to review or consolidate specific material.
    • To create a sense of accomplishment/success.
    • To guide the planning/development of the ongoing teaching process.
    • To determine if (and to what extent) the objectives have been achieved.
    • To encourage improvement.
  • 6.
    • Depending on purpose: Screening/Selection/Admission, Placement, Proficiency, Aptitude, Diagnostic, Achievement, Progress.
    • Depending on characteristics: Direct tests/Indirect tests, Discrete point tests/Integrative tests, Criterion-referenced/Norm-referenced, Objective tests/Subjective tests, Speed tests/Power tests, Knowledge tests/Skill tests.
  • 7.
    • Depending on purpose :
    • Screening/Selection/Admission : to determine whether a person has the behavior required to succeed in a specific program (not based on course objectives), e.g. IPC’s admission test.
    • Placement : to determine the level at which a person should be placed within a program (designed by the institution), e.g. CVA’s placement test.
    • Proficiency : to determine whether a person shows overall proficiency in a language, compared to native speakers in real-life contexts, e.g. the TOEFL test.
  • 8.
    • Aptitude : to assess a person’s talent for doing something specific, i.e. the suitability of a candidate for a particular program of instruction.
    • Diagnostic : it refers to entrance behavior or previous knowledge; it determines strengths and weaknesses so that potential problems can be addressed (performed by the teacher).
    • Achievement : to determine whether a given objective has been covered successfully.
    • Progress : to check improvement against a reference point in a program.
  • 9.
    • Depending on characteristics :
    • Direct tests : they assess what they are intended to measure in a straightforward manner.
    • Indirect tests : they give information about aspects that are not the focus but are implicitly addressed (e.g. a reading comprehension cloze may give an indirect measure of vocabulary knowledge).
    • Discrete point tests : the focus is on restricted areas of the target language (e.g. a cloze test on verb tenses).
    • Integrative tests : answers demand the combination of many areas of language knowledge to produce the required product (e.g. oral interviews, reading comprehension, essay writing).
  • 10.
    • Criterion-referenced tests : they describe what a person can do in relation to the course objectives or predefined criteria. There is no comparison between students.
    • Norm-referenced tests : test results are compared so as to measure one person’s performance in relation to a given population.
    • Objective tests : no judgment is involved; answers are either right or wrong (e.g. multiple-choice items).
    • Subjective tests : judgment and opinion on the part of the rater are involved; there is no single right or wrong answer, but a continuum (e.g. opinion/discussion items).
  • 11.
    • Speed tests : easy items that must be answered in a very short time. They assess speed of performance and strategy, e.g. scanning exercises.
    • Power tests : the difficulty of the items demands enough time to respond. They assess actual control over the aspects under scrutiny.
    • Knowledge tests : they assess the language components, e.g. grammar quizzes.
    • Skill tests : they focus on listening, speaking, reading and/or writing, e.g. listening quizzes.
  • 13. I. Specific guidelines : the way the test is designed and organized. II. Moderation of mark scheme : the way in which teachers set the scoring of the test. III. Standardization of examiners : the way in which examiners guarantee common criteria for correction.
  • 14.
    • I. Specific Guidelines
    • Moderation of tasks : seeking feedback; revision of the tasks by other teachers.
    • Level of difficulty : The tasks in a test should be arranged from easy to difficult. Starting with the most difficult task will lead the weakest learners to give up early. An item is easy if 75% of students answer it correctly, average if 50% answer it correctly, and difficult if only 25% answer it correctly (determined through a pilot test).
    • Discrimination : A test should allow candidates at different levels to perform according to their abilities. A variety of tasks ranging from easy to difficult should highlight the differences between stronger and weaker learners. The number of difficult tasks should be limited, and they should be placed at the end of the test.
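The difficulty cut-offs above can be sketched as a short script. The proportions (75% = easy, 50% = average, 25% = difficult) come from the slide; the function names and the sample response data are illustrative, not part of the presentation.

```python
def facility_index(responses):
    """Proportion of students who answered the item correctly.

    `responses` is a list of booleans: True = correct answer.
    """
    return sum(responses) / len(responses)


def classify_item(responses):
    """Label an item using the pilot-test cut-offs from the slide:
    ~75% correct = easy, ~50% = average, ~25% or fewer = difficult."""
    p = facility_index(responses)
    if p >= 0.75:
        return "easy"
    if p >= 0.50:
        return "average"
    return "difficult"


# Hypothetical pilot results for one item: 6 of 8 students answered correctly.
print(classify_item([True] * 6 + [False] * 2))  # → easy
```

Running such a classification over every item in a pilot test gives the designer the easy-to-difficult ordering the slide recommends.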
  • 15.
    • Appropriate sample : The test should present a representative sample of the objectives, activities and tasks taught or used in the classroom.
    • Overlap : it occurs when content is assessed more than once. It should be avoided, both because reassessing content produces an unrepresentative sample and to prevent visual and mental overload for students.
    • Clarity of tasks : Instructions should be simple and unambiguous, providing a clear indication of what the task demands from the student. Instructions should never be more difficult than the task.
    • Questions and texts : The selection of questions and texts will depend on the purpose and the formats chosen by the designer of the test. Again, the difficulty should lie in the task, not in the question. At the same time, questions should not be so simple, obvious, or answerable from world knowledge alone.
  • 16.
    • Timing : give students a reasonable time to complete the test, since too little time will produce unreliable results. Students should be aware of the time set to complete each part of the test. The time allotted should reflect the importance and difficulty of what is being assessed. Teachers can pilot the test with a group of a similar level, or draw on similar classroom assessment experiences, to determine an appropriate time limit.
    • Layout : presentation, printing, spacing, font size, style, formats (a, b, c… I, II, III, IV… 1, 2, 3…). The layout should be consistent. Each single part should be kept on the same page.
    • Bias : it can result from experiential, cultural or knowledge-based factors. Avoid items or topics inclined to give an unfair advantage to a particular group of students. Also avoid tasks or issues so obscure that candidates might have no frame of reference with which to process and comprehend what is being asked.
  • 17.
    • II. Moderation of Mark Scheme
    • Acceptable response/variations .
    • Subjectivity in productive tasks .
    • Weighting (balance between items/tasks and scores).
    • Computation : The data and results should be easy to compute; the handling of numbers should be simple for both students and teacher to conceive and process.
    • Avoidance of muddied measurement : The use of one skill should not interfere with the measurement of another.
    • Accessibility/intelligibility of mark scheme : Easy and convenient to access, use and understand.
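The weighting and computation points above can be sketched as a minimal weighted mark scheme. The task names and weight values below are illustrative assumptions, not from the presentation; the only idea taken from the slides is that item/task weights must balance and the arithmetic must stay simple.

```python
# Hypothetical weights reflecting each task's importance (must sum to 1.0).
weights = {"reading": 0.4, "grammar": 0.3, "writing": 0.3}


def final_score(raw, max_points):
    """Convert raw points per task into a weighted percentage.

    `raw` and `max_points` map task name -> points obtained / points possible.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weighting must balance
    return 100 * sum(
        weights[task] * raw[task] / max_points[task] for task in weights
    )


raw = {"reading": 8, "grammar": 9, "writing": 6}
maximum = {"reading": 10, "grammar": 10, "writing": 10}
print(round(final_score(raw, maximum), 1))  # → 77.0
```

Keeping the weights explicit in one place makes the scheme easy to moderate: other teachers can review and adjust the balance without touching the computation.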
  • 18.
    • III. Standardization of examiners
    • Agreement on criteria : by teachers and students.
    • Trial assessment : to assess difficulty and potential problems.
    • Review procedures : to make sure they fit test purposes.
    • Follow-up checks : notes or reports on the results of the tests (to improve or consolidate them).