
Towards Standardization: Designing Exit Tests for Levels


Presentation at TESOL 2010 convention, Boston, MA, USA



  1. Towards Standardization: Designing Exit Tests for Levels
     Juanita Huan Zhou
     York University English Language Institute, Toronto, Canada
     March 2010
  2. Outline
     • The Project
     • Objectives
     • Progress
     • Challenges
     • Sample: Level 2 Exit Test
     • Plan of Action
     • Caveats
     • References
     • Acknowledgement
  3. The Project
     • To design exit tests for all levels in a 7-level EAP program
     • Criterion-referenced achievement tests to be administered in the final week of an 8-week session
     • Test takers: international students with varied backgrounds who plan to pursue undergraduate and/or graduate study at a North American university
     • Relatively high-stakes decisions with an impact on students’ finances, eligibility for continuing in the YUELI program, timetable for starting university, candidacy for scholarships, etc.
     • Test developers = test writers = test administrators = raters: full-time ESL teachers with extensive experience teaching the levels for which they were asked to design exit tests
  4. Objectives
     • To identify students who are not ready for the next level in specific skill areas at the end of an eight-week session
     • To ensure that uniform standards are applied across sections and sessions when such decisions are made
     • To guarantee fairness to students and teachers alike
  5. Progress
     • Levels and skill areas for which exit tests have been developed:
       Level 1: Grammar & Writing
       Level 2: Grammar & Writing
       Level 3: Grammar & Writing
       Level 4: Grammar & Reading
       Level 5b: Grammar, Writing, Reading, & Listening
     • Timeline of the piloted exit tests:
       Level 1: Jan/Feb 2008
       Level 2: Jul/Aug 2008
       Level 3: Jul/Aug 2008
       Level 4: December 2008
       Level 5b: October 2009
  6. Challenges of Designing a Useful Test
     • Qualities of test usefulness (Bachman & Palmer, 1996): Reliability, Construct Validity, Authenticity, Interactiveness, Impact, Practicality
     • Types of validity and reliability (Alderson, Clapham, & Wall, 1995): Face Validity, Content Validity, Response Validity, Concurrent Validity, Predictive Validity, Construct Validity, Intra-Rater Reliability, Inter-Rater Reliability
  7. Questions to Ask
     • Does the test test what it is supposed to test?
     • Does the language being tested reflect the usage of language in the TLU domain?
     • Does the test produce the same results regardless of when or where students take it; which version they take; which rater rates them if there are several raters; or when a rater rates them if there is just one rater?
     • Does the test score mean what we want it to mean?
  8. Challenges of the Project
     • Limited resources in funding & time allotment:
       Level 1: 20 overtime contact hours
       Level 2: 30 overtime contact hours
       Level 3: 20 overtime contact hours (10 contact hours each for a team of two)
       Level 4: 21 overtime contact hours (7 contact hours each for a team of three)
       Level 5b: 52 contact hours in place of two seminars (i.e., teaching off-loads over two sessions)
  9. Sample: Level 2 Exit Test
     Syllabus-based constructs to be measured: the abilities
     • to use correct syntax and word form;
     • to use simple present, present continuous, simple past, simple future, and “going to” future appropriately;
     • to use correct verb form after the helping verbs do, does, did; the modal verbs can, should, must, will; and the phrasal modal have to;
     • to use gerund or infinitive after want, need, like, prefer, and enjoy;
     • to form questions in the aforementioned verb tenses;
     • to write compound sentences using and, but, or, and so with correct punctuation;
     • to write complex sentences (adverb clauses) using because, when, and if (first conditional only) with correct punctuation;
     • to edit sentence fragments, comma splices, and run-on sentences;
     • to write a logical paragraph expressing personal opinion: write a topic sentence with specific controlling idea(s), support the topic sentence with 2-3 main points and sufficient details, and create a concluding sentence.
  10. Sample: Level 2 Exit Test
     • Part One: Objective test; 60 minutes; cut-off point 65%
       Task One: Identifying Parts of Speech in a Paragraph (“Why Toronto is Great for ESL Students”); 15 items; 15%; scored right or wrong
       Task Two: Supplying Missing Questions in an Interview (“Getting to Know You: A Teacher-Student Interview”); 10 items; 25%; partial credit
       Task Three: Gap-Filling with Correct Verb Tense and Verb Form of Given Verbs in a Dialogue (“Credit Cards, Debit Cards, or Cash? A Dialogue Between Two ESL Students”); 33 items; 33%; scored right or wrong
       Task Four: Editing a Paragraph (“My Trip to Canada”); 27 items; 27%; scored right or wrong
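Part One's four task scores combine into a single percentage compared against the 65% cut-off. A minimal sketch of that arithmetic (illustrative only, not the program's actual marking tool; the function name is hypothetical, and the task maximums follow the slide's weights: 15 + 25 + 33 + 27 = 100 points):

```python
# Illustrative Part One scoring sketch (hypothetical helper, not YUELI's
# actual marking tool). Task maximums follow the slide: 15 + 25 + 33 + 27 = 100.
MAX_POINTS = {"one": 15.0, "two": 25.0, "three": 33.0, "four": 27.0}
CUT_OFF = 65.0  # percent

def part_one_result(task_scores):
    """task_scores: raw points earned per task. Returns (percent, passed)."""
    total = sum(task_scores[task] for task in MAX_POINTS)
    percent = 100.0 * total / sum(MAX_POINTS.values())
    return percent, percent >= CUT_OFF

# A student earning 12, 18.5, 25, and 15 points scores 70.5% and passes.
percent, passed = part_one_result({"one": 12, "two": 18.5, "three": 25, "four": 15})
print(f"{percent:.1f}% -> {'pass' if passed else 'fail'}")
```

Because the maximums sum to 100, the percentage equals the raw total here; the division makes the sketch hold up if point values change, as they did in the revised version.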
  11. Sample: Level 2 Exit Test, Part One
     Task One
     What parts of speech are the words in italics and bold type: nouns (n.), verbs (v.), adjectives (adj.), or adverbs (adv.)? Write the answers above the words in italics and bold type. (15 points: 1 point each)
     Example: He spoke (v.) to me angrily (adv.).
     People from all over the world immigrate (1) to Toronto, so ESL students can experience (2) many different (3) cultures in this city. They can taste (4) foods from almost (5) any country in the world. They may not like the taste (6), but it will be an interesting (7) experience (8).
  12. Sample: Level 2 Exit Test
     Task Two
     What are the missing questions in the following interview between a YUELI teacher and a new student? (25 points: 2.5 points each)
     Example:
     Teacher: What’s your name?
     Student: My name is Ken Tanaka.
     (1) Teacher: ____________________________?
         Student: I came to Toronto two months ago.
     (2) Teacher: ____________________________?
         Student: I studied at a private language school before I came to YUELI.
     (3) Teacher: ____________________________?
         Student: Because there were too many Japanese students at that school.
     (4) Teacher: ____________________________?
         Student: My friend told me about YUELI.
  13. Sample: Level 2 Exit Test
     Task Three
     Fill in the blanks with the verbs in parentheses. Choose the correct verb tense: simple present, present continuous, simple past, simple future, or “be going to” future. Use the correct verb form (base form, -s, -ing, or -ed), gerund, or infinitive. (33 points: 1 point for each blank)
     Example: Minjae: _____ you _____ (have) a good time?
     Answer: Are you having a good time?
     Minjae: Thank God it’s Friday! What ________ (1) you ________ (2) (do) this weekend, Ibrahim?
     Ibrahim: I don’t know. I haven’t decided yet. Maybe I ________ (3) (go) shopping. What about you?
     Minjae: I wish I could, too, but I don’t have any money left. I ________ (4) (wait) for my father to send me some.
  14. Sample: Level 2 Exit Test
     Task Four
     Correct the 27 mistakes in the following paragraphs. Write the corrections above the mistakes. The mistakes include: no subject in the sentence; no verb in the sentence; wrong verb tense; wrong verb form; wrong past tense of irregular verbs; repeating subject in the sentence; wrong punctuation or no punctuation; wrong capitalization. (27 points: 1 point per mistake)
     Example: Is the capital of Korea.
     Correction: Seoul is the capital of Korea.
     When I leaved my country for Canada. My mother and father crying. I wanted stay, but I goodbye to my parents and spended 14 hours on the plane. When I arrived in Toronto. I been happy. Because was a lonely trip. My uncle meets me at the airport. We were go to his house, the house very big.
  15. Sample: Level 2 Exit Test (Part One Specifications)
     Parts of Speech: 7 nouns (article+adj+n; subject+v+plural n; there be+n; possessive adj+n; demonstrative+n; subject+be+plural n); 4 verbs (subject+v; modal+base form; to+base form); 2 adjectives (subject+be+adj; article+adj+n); 2 adverbs (ending in -ly; adv of frequency); 4 words used twice, each time as a different part of speech; 2 words each from three word families.
     Question Formation: 7 information questions (when, where, why, what, who, how, how long); 3 yes-no questions. Verb tenses incl. simple present, main verb=be (2); simple present, main verb≠be (1); present continuous (1); simple past (4); simple or “going to” future (2).
     Verb Tenses and Verb Forms: Verb tenses incl. simple present (9); present continuous (2); simple past (6); “be going to” future (1); simple future (3). Verb forms incl. base form (7); 3rd person singular (4); modal (2); continuous form (2); past tense, regular (2); past tense, irregular (4). Gerund or infinitive (1). Sentence types: negative (4); interrogative (1); imperative (1); first conditional (1); there be (1).
     Error Corrections: Run-on sentence (1); comma splice (1); fragment, DC as IC (3); fragment, no subject (1); fragment, no verb (3); repeating subject (1); wrong verb tense (2); wrong verb form (6); wrong past tense of irregular verbs (3); capitalization mistakes (3); punctuation mistakes (2); wrong co-ordinating conjunction (1).
  16. Sample: Level 2 Exit Test
     • Part Two: Subjective test; 90 minutes; passing grade 2.5, or 65%-69%
       Task One: Dictogloss (“How I Learned English”); 333 words read in 2’10”; holistic scale; 50%
       Task Two: Choose one of three prompts; 250 words; holistic scale; 50%
       Prompts: (1) Why is English difficult to learn? (2) What is the best age for learning English? (3) Compare studying English in your country and studying English in Canada.
  17. Sample: Level 2 Exit Test, Part Two
     Task One
     Listen to the story three times. After listening, rewrite the story. It is not a dictation, so you don’t need to use the same words or the same sentence structure. You can use your own words and make your own sentences. The most important thing is to tell the story clearly and completely. You should write about 200 words.
     Transcript
     People often ask me, “When did you start to learn English?” I really don’t know how to answer them. I first started to listen to English recordings when I was six, but I wouldn’t say that was when I started to learn English. You see, my father would record BBC English programs from the radio and then record them again with classical music playing in the background. He told me if I listened to these recordings every morning for 30 minutes, he would pay me 50 cents a day.
  18. Sample: Monitoring Part One of Level 2 Exit Test Statistically
     Population (07/2008-08/2009): 181 students, 15 classes, 8 teachers, 7 sessions; statistic: passing rate from each class
     Sample One: 95 students, 8 classes, 5 teachers, 5 sessions; statistics: mean, range, passing rate
     Sample Two: 51 students, 4 classes, 4 teachers, 2 sessions; statistics: mean, range, passing rate, task mean, task range
     Sample Three: 44 students, 4 classes, 1 teacher, 4 sessions; statistics: mean, range, passing rate, median, mode, standard deviation
     Sample Four: 116 students, 15 classes, 8 teachers, 7 sessions; statistic: percentage of students who passed Level 3 on the first, second, and third attempt
  19. Sample Statistics of Part One of Level 2 Exit Test (07/2008-08/2009)
     • Sample One (95 students): Mean: 74.6%; Test Passing Rate: 83.3%
     • Sample Two (51 students): Task Mean: 73.3% (One); 82.1% (Two); 83.6% (Three); 61.8% (Four)
     • Sample Three (44 students): Standard Deviation: 10.9
     • Entire Population (181 students): Level Passing Rate: 84.0% (excluding Unable to Evaluate)
     • Sample Four (116 students promoted to Level 3): Percentage of Students Who Passed Level 3 on the First Attempt: 81.9%
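The sample statistics above (mean, standard deviation, passing rate) are simple to compute once each student's Part One percentage is recorded. A minimal sketch, using invented scores rather than the actual sample data:

```python
# Sketch of the descriptive statistics reported for the monitoring samples.
# The scores below are invented for illustration; the real samples had
# 95, 51, and 44 students respectively.
from statistics import mean, pstdev

def describe(scores, cut_off=65.0):
    """Return (mean, population standard deviation, passing rate %) for a
    list of test percentages, using the Part One cut-off of 65%."""
    passing_rate = 100.0 * sum(s >= cut_off for s in scores) / len(scores)
    return mean(scores), pstdev(scores), passing_rate

scores = [72, 81, 64, 90, 58, 77, 69, 85]
m, sd, pr = describe(scores)
print(f"mean {m:.1f}%, SD {sd:.1f}, passing rate {pr:.1f}%")
```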
  20. Sample: Monitoring Level 2 Exit Test Empirically
     • Unanticipated answers from test takers in Part One
     • Time allotment for Part One
     • Comparison between Exit Test Part One and the Final Grammar Test
     • Comparison between Exit Test Part One and teachers’ continuous assessment
     • Problems with Task One in Part Two
     • Writing prompts for Task Two in Part Two
     • Security of Part Two
     • Marking rubric for Part Two
     • Issues concerning rater reliability
  21. Sample: Level 2 Exit Test Revisions
     Part One
     • Length: 60 min (original) → 90 min (revised)
     Task One
     • # of Items: 15 → 20
     • Weighting: 15% → 24.2%
     • Specifications: no changes for nouns and verbs; addition of 3 adjectives and 2 adverbs, for a total of 5 adjectives and 4 adverbs
     • Marking Rubric: no changes (one point each; no partial credit)
     Task Two
     • # of Items: no changes (10 total)
     • Weighting: 25% → 18.2%
     • Specifications: no changes
     • Marking Rubric (original): 2.5 points each; 1 point for logic; 1 point for question formation; 0.5 point for verb tense
     • Marking Rubric (revised): 3 points each; students lose all points for wrong logic, wrong question formation, or wrong helping verb; otherwise, dock 1 point for a wrong verb tense and 0.5 point for other mistakes such as wrong word order, wrong preposition, wrong verb form, and missing article
  22. Sample: Level 2 Exit Test Revisions
     Task Three
     • # of Items: 33 → 37
     • Weighting: 33% → 22.4%
     • Specifications: deleted 1 base form; added 3 infinitives, 1 gerund, and 1 verb “be” in simple present
     • Marking Rubric: no changes (one point each; no partial credit)
     Task Four
     • # of Items: 27 → 28
     • Weighting: 27% → 35.2%
     • Specifications: no changes
     • Marking Rubric (original): 1 point each; no partial credit; no penalty for correcting what is correct
     • Marking Rubric (revised): 2 points each; if students identify what is correct as wrong, they do not lose points if they supply a correct alternative; however, they lose 2 points if they supply a wrong version
  23. Sample: Level 2 Exit Test Revisions
     Part Two
     • Length: no change
     • Marking Rubric: holistic → analytic
     Task One
     • Reading Speed: 333 words in 2’10” → 333 words in 2’30”
     • Weighting: 50% → 25%
     Task Two
     • Prompts: choose one from three prompts → one assigned topic rotated every three sessions
     • Topics (original): (1) Why is English difficult to learn? (2) What is the best age for learning English? (3) Compare studying English in your country and studying English in Canada.
     • Topics (revised): (1) Which is better, travelling alone or travelling with friends? (2) Is money a good gift? (3) Compare studying English in your country and studying English in Canada.
     • Weighting: 50% → 75%
  24. Sample: Monitoring Revised Level 2 Exit Test Part One Statistically
     Population (09/2009-02/2010): 102 students, 8 classes, 4 teachers, 3 sessions; statistics: level passing rate, test passing rate, mean, range, task mean, task range
     Sample One: 59 students, 5 classes, 3 teachers, 3 sessions; statistics: mean, range, passing rate, task mean, task range, median, mode, standard deviation
     Sample Two: 32 students, 3 classes, 3 teachers, 1 session; statistics: mean, range, passing rate, task mean, task range, median, mode, standard deviation, item facility, B-index
     Sample Three: 35 students, 8 classes, 4 teachers, 3 sessions; statistic: percentage of students who passed Level 3 on the first, second, and third attempt
  25. Comparison of Statistics of Part One between the Original and the Revised Version
     • Mean: 74.6% (95 Ss) → 66.5% (102 Ss)
     • Standard Deviation: 10.9 (44 Ss) → 10.7 (59 Ss)
     • Task Mean (One): 73.3% (51 Ss) → 70.3% (102 Ss)
     • Task Mean (Two): 82.1% (51 Ss) → 75.8% (102 Ss)
     • Task Mean (Three): 83.6% (51 Ss) → 78.3% (102 Ss)
     • Task Mean (Four): 61.8% (51 Ss) → 50.5% (102 Ss)
     • Test Passing Rate: 83.3% (95 Ss) → 61.7% (102 Ss)
     • Level Passing Rate: 84.0% (181 Ss) → 70.0% (100 Ss)
     • Successful First Attempt at Level 3: 81.9% (116 Ss) → 88.6% (35 Ss)
  26. Plan of Action
     • Monitor the usefulness of the exit tests by collecting data systematically and calculating the statistics: mean, mode, median, standard deviation, range, highest, lowest, task mean, task range, task highest, task lowest, item facility, B-index, agreement statistic, item phi, test passing rate, & level passing rate
     • Compare students’ performance on the exit tests and the next level’s diagnostic tests
     • Revise the exit tests by re-examining items with a facility index less than 0.40 (or an average less than 40% of the full score) and items with a negative B-index
     • Write multiple versions of the exit tests
     • Implement rater training and monitoring
     • Improve the reporting of scores
     • Use the exit tests to evaluate the curriculum by calculating the difference index through an intervention study (Brown & Hudson, 2002)
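The two item statistics used above as revision triggers follow the definitions in Brown & Hudson (2002): item facility is the proportion of test takers who answer an item correctly, and the B-index is the item facility of students who passed the whole test minus that of students who failed. A minimal sketch (the function names are illustrative, not from any cited source):

```python
def item_facility(responses):
    """Proportion of correct answers; responses is a list of 0/1 per test taker."""
    return sum(responses) / len(responses)

def b_index(responses, passed):
    """Item facility of passers minus item facility of failers.
    passed is a parallel list of booleans: did the student pass the test?"""
    passers = [r for r, p in zip(responses, passed) if p]
    failers = [r for r, p in zip(responses, passed) if not p]
    return item_facility(passers) - item_facility(failers)

def flag_for_review(responses, passed):
    """Flag an item per the criteria above: facility below 0.40 or negative B-index."""
    return item_facility(responses) < 0.40 or b_index(responses, passed) < 0

# One item answered by six students; the first four passed the test overall.
responses = [1, 1, 1, 0, 0, 1]
passed = [True, True, True, True, False, False]
print(item_facility(responses), b_index(responses, passed), flag_for_review(responses, passed))
```

A negative B-index means students who failed the test did better on the item than students who passed, which is why such items are singled out for re-examination.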
  27. Caveats
     • Exit tests should not be used as the sole determining factor in making level decisions
     • Exit tests should not be used as a tool to evaluate teachers
  28. References
     Alderson, J. C., Clapham, C., & Wall, D. (1995). Language test construction and evaluation. Cambridge: Cambridge University Press.
     Bachman, L. F. (2004). Statistical analyses for language assessment. Cambridge: Cambridge University Press.
     Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice: Designing and developing useful language tests. Oxford: Oxford University Press.
     Brown, H. D. (2004). Language assessment: Principles and classroom practices. White Plains, NY: Pearson Education.
     Brown, H. D. (2001). Part V: Assessing language skills. In Teaching by principles: An interactive approach to language pedagogy (2nd ed.). White Plains, NY: Pearson Education.
     Brown, H. D. (1994). Chapter 10: Language testing. In Principles of language learning and teaching (3rd ed.). Englewood Cliffs, NJ: Prentice Hall Regents.
     Brown, J. D., & Hudson, T. (2002). Criterion-referenced language testing. Cambridge: Cambridge University Press.
     Fox, J., Wesche, M., Bayliss, D., et al. (Eds.). (2007). Language testing reconsidered. Ottawa, ON: University of Ottawa Press.
     Fulcher, G., & Davidson, F. (2006). Language testing and assessment: An advanced resource book. Florence, KY: Routledge.
  29. Acknowledgement
     Many thanks to my colleagues who developed the Level 3, 4, and 5b exit tests respectively for sharing their products and thoughts with me:
     • Dayna Aguilera
     • Jerry Carson
     • Henry Park
     • Gary Simon
     • Farbod Eskandiri
     Many thanks to Paul Elo, the YUELI technician, for the statistics of Level 2 passing rates and Level 2 graduates’ Level 3 attempts.