Language Testing and Assessment
  • Any writer or designer will tell you that 90% of the creative process is destructive.
  • Cloze test – integrates everything (grammar, vocabulary, use of context …) and is thus more economical and efficient; a direct test is too costly.
  • conducive |kənˈdjuːsɪv| adjective (usu. conducive to): making a certain situation or outcome likely or possible.
  • confer (v.): discuss.

Language Testing and Assessment: Presentation Transcript

  • Workshop on Language Testing and Assessment. Date: 28 – 29 January 2014. Presented by Soeung Sopha 0
  • Language Testing ​ Presented by Mr. SOEUNG SOPHA Contact: (855) 81 702 123 Email: aep.manager@wegcambodia.com 1
  • Aim: To present the trainees with the types and purposes of tests, and how to write a good test. Objectives: By the end of the training, trainees will be able to: - Design better tests for their teaching career - Critique their previous tests - Use the correct type of test for its purpose 2
  • Testing and YOU What is the exact nature of your job? e.g.  What & where do you TEACH?  Do you have duties BEYOND teaching? (e.g. supervise other teachers) 3
  •  How are you involved with tests? e.g.  I prepare students for tests (say which test, e.g. TOEFL, IELTS etc.)  I have to make tests  I have to check and score tests  I have to plan or supervise tests  I have to take the IELTS test  WHY is STUDYING about LANGUAGE TESTING IMPORTANT to you? 4
  • To start thinking about… WHAT is a TEST? WHY do we need to understand TESTS & LANGUAGE TESTING better? 5
  • What word do you see first… What has LOVE got to do with TESTING? 6
  • Test Functions or Properties. Finding-out functions: 1- What does a test seek to find out? 2- What can a test find out? 3- Do the answers to these two questions match? Gate-keeping functions: 1- What can a test decide about the test-takers’ destiny? 2- Who are the gatekeepers, and who sets the criteria? 3- What is the test’s original gate-keeping function? Washback effect: 1- What does the test ‘say’ to people? 2- How important is the test for the test-takers? 3- How highly regarded is the test? 7
  •  “I didn’t fail the test, I just found 100 ways to do it wrong.” – Benjamin Franklin  “Experience is a hard teacher because she gives the test first, the lesson afterward.” – Vernon Law 8
  • Why do we need to understand language testing? Tests concern all of us 9
  • As Test-takers: Understanding the test helps us know what we have achieved, & what we have not.  Understanding the test helps us do better & get what we need or want.  Understanding the test helps us know how much our learning should depend on the test.  10
  • As Teachers: What does a test really tell us about our learners? How can we find out what we want to find out?  Where do our learners want to go? How can we help them get there?  How useful is the test for our teaching objectives? To what extent should we teach to the test? How can we use tests to drive the learning we desire?  11
  • As Test-makers: Do our tests really find out what we want to find out?  Are our tests fair gate-keepers?  Do our tests help to achieve teaching goals?  12
  • 1- Purposes of Language Testing We can classify tests according to their FUNCTIONS or PURPOSES. “Everybody is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid” – Albert Einstein 13
  • 14 Warm-up Exercise: What are the differences among the following tests?  TOEFL, IELTS or a Job Interview in English  An English examination in the final year of secondary school  A classroom test set by the teacher, possibly at the beginning of course or school year  A test set by a language centre for students who want to enroll for an English course, but have not started yet.  A test given to all students in a course, but whose results are not made known to them
  • 15 PROFICIENCY vs. ACHIEVEMENT  PROFICIENCY TESTS decide whether the person’s language is good enough for some future purpose.  Usually high stakes gate-keeping – e.g. getting a job, immigration, admission into university  Not tied to any course that the test-taker is studying, BUT  Many ‘outside’ courses to help test-takers  Tend to be conservative (e.g. don’t change often), standardized
  • 16 PROFICIENCY vs. ACHIEVEMENT (cont’d)  ACHIEVEMENT TESTS focus on how much the students have learned of what they are supposed to learn.  Some are high stakes (e.g. graduation exams), others less so (e.g. mid-course test)  Tied to a course the student is taking  ‘Extra’ help often within the course  More open to changes (e.g. alternative assessment modes, such as portfolio, self-assessment)
  • 17 DIAGNOSTIC & PLACEMENT  DIAGNOSTIC TESTS aim to find out a learner’s strengths & weaknesses, so that the teacher knows what to do.  Many ACHIEVEMENT tests have DIAGNOSTIC aims, but not always (e.g. graduation exam)  Many DIAGNOSTIC tests are ACHIEVEMENT tests, but not always (e.g. pre-test at beginning of a course)
  • 18 DIAGNOSTIC & PLACEMENT  PLACEMENT TESTS are used to decide where (e.g. in which class) to put a student.  PROFICIENCY tests are often used for PLACEMENT, but not always (e.g. TOEFL for admission to university)  PLACEMENT tests are usually PROFICIENCY-type tests, but ACHIEVEMENT tests can be used for placement (e.g. school results)
  • 19 Aptitude & Progress  Aptitude Test: To predict a person’s future success in learning a (any) foreign language  Taken before actual learning  Progress tests: to assess students’ mastery of the course material (during the course)
  • 20 REFLECTIONS What have you realized about:  WHY understanding language testing is important?  WHAT you must remember about any test you are connected with? Write down THREE points, and share them with your group.
  • Every single thing has its own beauty.
  • 23 2- TEST CONSTRUCT  Every TEST CONSTRUCT is based on a theory of … What does it mean to know a LANGUAGE?
  • “Fundamental to the preparation of valid tests of language proficiency is the theoretical question of what does it mean to know a language?” Spolsky 1968 24
  • 25 1 Teaching Philosophy 2 Teaching Approach 3 Test Design
  • General Classification 26 Direct vs. Indirect Test. Direct: candidates are required to perform the skill the test intends to measure. Indirect: measures skills that underlie performance in a particular task.
  • Specific Types of Construct 27 Discrete-point vs. Integrative. Discrete-point: every item focuses on one clear-cut segment of the target language without involving the others, e.g. a multiple-choice test. Integrative: candidates need to use a number of language elements at the same time in completing the test tasks, e.g. essay writing, reading comprehension, cloze test…
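The cloze format is easy to sketch mechanically. The minimal Python illustration below (the passage and the deletion ratio are invented for the example) blanks every n-th word of a text, so that candidates must integrate grammar, vocabulary and context at once to restore it:

```python
def make_cloze(text, n=6, start=2):
    """Fixed-ratio cloze: blank every n-th word, starting after a short lead-in."""
    words = text.split()
    answers = []
    for i in range(start, len(words), n):
        answers.append(words[i])
        words[i] = "____"
    return " ".join(words), answers

passage = ("A cloze test deletes words at regular intervals so that "
           "candidates must draw on grammar, vocabulary and context "
           "at the same time to restore the original text.")
gapped, answers = make_cloze(passage)
print(gapped)   # passage with 5 blanks
print(answers)  # the deleted words, for the scoring key
```

Real cloze papers vary the start point and ratio, and often use rational (selective) deletion rather than a fixed count; the fixed-ratio version is just the simplest to automate.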
  • 28 Communicative vs. Performance-based. Communicative: uses real language in a real situation; attention to social role; candidate in an extended act of communication. Performance-based: requires application of learning in an actual situation; attention to social role; often aims at selecting a candidate for a job.
  • 29 Discussion: In your group:  Share ONE test with which you are involved. SHOW it to the group if you can.  Identify the test CONSTRUCT(S) used.
  • 30 Analysis & Critique ANALYSIS & CRITIQUE:  What THEORY of language learning does the test reflect?  Is the test consistent with the philosophy or theory underlying the course with which it is connected?  Is the test consistent with YOUR theory of language learning?
  • 31 Implication & Application  If you can, how will you CHANGE the TEST to make it consistent:  with the course philosophy?  with your own theoretical beliefs?  If you cannot change the test, how should you ADAPT your TEACHING to accommodate the test? What should you do?
  • 3- What Makes A Good Test? 32
  • 33
  • Two Fundamental Criteria 34
  • 35 Introduction Other important considerations: Is it PRACTICAL?  Is it easy & economical to administer? Is it DISCRIMINATING?  Crucial for NORM-REFERENCED tests  Can the results show clearly who the better candidates are?
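The “is it discriminating?” question has a classical numeric answer for norm-referenced tests: the upper–lower discrimination index for each item. A small Python sketch (the candidate scores are invented for illustration):

```python
def discrimination_index(totals, item_correct):
    """Upper-lower discrimination index for one item.

    totals       : total test score per candidate
    item_correct : 1/0 per candidate for the item in question
    Returns D = p(upper half) - p(lower half); higher D means the item
    separates stronger from weaker candidates better.
    """
    ranked = sorted(zip(totals, item_correct), key=lambda p: p[0], reverse=True)
    k = len(ranked) // 2
    upper = [c for _, c in ranked[:k]]
    lower = [c for _, c in ranked[-k:]]
    return sum(upper) / k - sum(lower) / k

# 8 candidates: the item is answered correctly mainly by high scorers.
totals  = [95, 90, 85, 80, 40, 35, 30, 25]
correct = [1,  1,  1,  0,  1,  0,  0,  0]
print(discrimination_index(totals, correct))  # 0.5
```

Operational testing programmes usually take the top and bottom 27% rather than halves, and flag items with D below roughly 0.2 for revision; the halves version keeps the sketch short.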
  • 36 1- VALIDITY: basics  Concerns whether the test is appropriate to the PURPOSE or FUNCTION  Related to THEORETICAL ASSUMPTIONS about Language & Learning  About TESTING WHAT YOU TEACH  Reflected in  Test CONSTRUCT  Test CONTENT
  • 37 Aspects of VALIDITY  CONSTRUCT VALIDITY  What test CONSTRUCTS are used?  Are they consistent with:  The philosophy of the teaching programme (Achievement Tests)?  A good theory of language and learning (Proficiency Tests)?
  • 38 Aspects of Validity  CONTENT VALIDITY  What do the items actually cover?  Grammatical items (Structural Syllabus)  Subject/Topics in Texts (e.g. football, shopping, nuclear war)  Have these items been taught?  Can the test-takers be expected to be familiar with them if not taught? (esp. Subjects/Topics)
  • 39 Aspects of VALIDITY  CRITERION-RELATED VALIDITY  Concurrent Validity  Does the test really test what the learners know now?  Predictive Validity  Does the test really test what we expect the test-taker to be able to do in future?
  • 40 Aspects of VALIDITY  SCORING VALIDITY  Is the test marked for the right things?  FACE VALIDITY  Does the test look like a test?
  • 41 VALIDITY: Group Discussion  Select a test you are familiar with, and discuss its VALIDITY in relation to the different aspects explored.  Suggest how the test can be made more valid.  Suggest some guidelines for making a test more valid.
  • 42 Making tests more VALID  Table of SPECIFICATIONS spelling out:  the test CONSTRUCT(S)  the CONTENT COVERAGE necessary
  • 43 Test Construct must be relevant to the PURPOSE of the test:  e.g.  Achievement Tests must reflect the Syllabus Approach. * Discrete-point grammar test for a grammatical/structural syllabus, communicative test tasks for a communicative syllabus  Proficiency Tests must reflect the real-world abilities & skills expected of candidates. * Academic listening/reading/writing tasks for a proficiency test for university admission
  • 44 Making tests more VALID  CONTENT coverage necessary:  Achievement Tests  the full range of content taught, and  only the content taught (e.g. for a grammatical syllabus, the no. of questions for each grammar item taught)
  • 45 Making tests more VALID  SCORING CRITERIA relevant to PURPOSE to be spelt out:  What to mark for  What not to mark for
  • 46 2- RELIABILITY: basics  About whether the scores can be TRUSTED  Key to reliability – CONSISTENCY:  Same/similar student(s) taking the same/similar test get similar scores  Different raters/markers give the same scores  The same rater/marker gives the same score at two different times
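The first sense of consistency listed above (same candidates, similar scores on two administrations) is commonly quantified as a test-retest correlation. A minimal Python sketch, with invented scores for six candidates:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient; a common test-retest reliability estimate."""
    mx, my = mean(x), mean(y)
    cov   = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

first_sitting  = [55, 62, 70, 48, 80, 66]
second_sitting = [58, 60, 73, 50, 78, 69]
print(pearson(first_sitting, second_sitting))  # close to 1 = highly consistent
```

A coefficient near 1 suggests the test ranks candidates consistently; in practice reliability is also estimated from a single sitting (e.g. split-half or internal-consistency methods), which this sketch does not cover.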
  • What Affects Reliability 47
  • 48 RELIABILITY: Group discussion  What problems related to the following factors might cause a test score to be unreliable? The CONDITIONS under which the candidates are tested  The TEST paper itself  The CANDIDATES  The SCORING procedure   Based on the problems identified, give a list of suggestions on how to make a test more reliable
  • Increasing RELIABILITY: Some suggestions  Create or ensure CONDITIONS for test-taking that are:  UNIFORM for all candidates  NON-DISTRACTING  COMFORTABLE & CONDUCIVE  ANXIETY- & STRESS-REDUCING 49
  • 50 Increasing RELIABILITY: Some suggestions  For the TEST PAPER:  Give enough questions  Start with easy questions  Do not allow too much freedom  Write unambiguous items  Make sure the instructions are clear & explicit  Ensure good layout & legibility  Try out the test paper on trusted colleagues
  • Increasing RELIABILITY: Some suggestions  For the CANDIDATE:  Familiarize the candidates with the test format and testing technique  Put the candidate at ease before starting  Help the candidate as much as possible within the rules (e.g. give time checks)  Create the right test conditions 51
  • 52 Increasing RELIABILITY: Some suggestions  For SCORING DIRECT TESTS:  Provide a detailed scoring key (criteria with band descriptors)  Train scorers  Agree acceptable responses & appropriate scores before starting  Employ multiple or inter-rater scoring  Hold periodic scorer meetings  Identify candidates by number, not name
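Inter-rater scoring, recommended above, can itself be monitored numerically. One standard statistic is Cohen's kappa, which measures agreement between two raters beyond what chance alone would produce. A Python sketch (the pass/fail ratings are invented):

```python
def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters.
    Ratings are category labels, e.g. band scores or pass/fail."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    # Expected agreement if each rater assigned labels at their own base rates.
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

a = ["pass", "pass", "fail", "pass", "fail", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass"]
print(round(cohen_kappa(a, b), 2))  # 0.67
```

Values near 1 indicate strong agreement; low or falling kappa at a scorer meeting signals that the band descriptors or rater training need attention.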
  • 54 1- Defining the Test ‘Problem’  Be clear what the PURPOSE of the test is: e.g. achievement test for a Writing 1 course, oral proficiency test for tour guides  Take note of the PRACTICAL CONSTRAINTS for the test:  TIME allocated for the test  MANPOWER needed (esp. scorers)
  • 55 2- Deciding the Test Specifications  Make a list of the CONTENTS necessary to make the test valid  Choose TEST TECHNIQUE(S) based on:  CONSTRUCT validity  PRACTICAL constraints  Decide ALLOCATION on principled grounds:  WHICH items to test more  WHAT percentage of the total score for each item
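The allocation step lends itself to a mechanical check: the percentage weights in a table of specifications must sum to 100, and each weight converts into a mark allocation on the paper. A Python sketch (the content areas, weights and paper length are invented for illustration):

```python
# Hypothetical specification for an end-of-course achievement test:
# each content area taught gets a percentage weight.
spec = {
    "past tenses":             30,
    "conditionals":            20,
    "reading (topics taught)": 30,
    "guided writing":          20,
}

total = sum(spec.values())
assert total == 100, f"weights sum to {total}%, not 100%"

# Convert the percentage allocation into marks for a 50-mark paper.
paper_marks = 50
allocation = {area: paper_marks * pct // 100 for area, pct in spec.items()}
print(allocation)
```

Writing the specification as data like this makes the principled-grounds decision explicit and lets colleagues audit whether the marks actually match the weights agreed on.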
  • 56 3- Constructing the Test  SELECT material according to CONTENT in specifications (e.g. reading passage based on topic taught)  WRITE the test items based on the specifications  PREPARE the scoring guide (i.e. answer key + marking scheme, marking descriptors)
  • 57 4- Testing & Re-writing the Test  In BIG tests (e.g. national exams, IELTS, TOEFL): pilot testing with experts & a sample candidature (tests of concurrent validity etc.)  In smaller institutional settings: let OTHER teachers DO the test to discover problems with:  instructions  difficulty of items  unreliability of particular questions
  • 58 5- Giving the Test  Ensure the best conditions for students to increase RELIABILITY  Where relevant & possible, take note of HOW students perform the test & factors affecting performance  REPORT observations  USE observations to improve future tests
  • 59 6- Scoring the Test  Provide as COMPREHENSIVE & DETAILED a guide as possible (e.g. band descriptors may specify how many errors are acceptable)  Note PROBLEMS & PATTERNS arising during scoring  Scorers should CONFER often  ALTER the scoring scheme where necessary
  • 60 7- Identifying Problems with the Test  Important for FAIRNESS & for IMPROVEMENT of the test  Based on OBSERVATIONS during test administration & scoring  System for REPORTING problems  Important for MODERATING scores
  • 61 8- Moderating the Scores  Adjust scores to fit expected or acceptable patterns  Important for reasons of FAIRNESS – related to the issue of RELIABILITY  May be based on PAST records of the institution  Can be done STATISTICALLY
  • 62 Discussion  To what extent is this carried out in YOUR situation, and for which tests?  How can you apply some PRINCIPLES in YOUR situation?  What RECOMMENDATIONS might you make to your institution to improve their TEST PROCEDURES?
  • Test Specification 63
  • Table of Specification 64
  • Table of Specification 65
  • 66 Think of the choices I- 254 + 18 = __________ A- 434 B- 262 C- 272 II- 254 + 18 = __________ A- 722 B- 227 C- 272
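The two option sets above differ in distractor quality: in version II the wrong answers (722, 227) are digit transpositions of the right answer 272, so they catch candidates who compute correctly but copy carelessly. That style of distractor can be generated mechanically; a Python sketch of the idea:

```python
def transposition_distractors(answer):
    """Generate distractors by swapping adjacent digits of the correct answer."""
    s = str(answer)
    out = set()
    for i in range(len(s) - 1):
        swapped = s[:i] + s[i + 1] + s[i] + s[i + 2:]
        if swapped != s:
            out.add(int(swapped))
    return sorted(out)

print(254 + 18)                        # 272
print(transposition_distractors(272))  # [227, 722]
```

Version I's distractors (434, 262) are not derived from the answer in any systematic way, which is the contrast the slide invites you to notice; how to build plausible distractors for non-arithmetic items is a separate craft.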