Article Presentation II

  1. ETS Guidelines for the Assessment of English Language Learners
     ESL 501
     Article Presentation
     Summer Schoenberg
  2. Key Terms
     - Construct – the skill or proficiency an assessment is intended to measure
     - Response – any type of performance to be evaluated as part of an assessment
     - Task – a specific test item, topic, problem, question, prompt, or assignment
     - Testing accommodation – any change to standardized testing conditions intended to make the test more fair and accessible for an individual or subgroup that does not change the construct being measured
  3. LEP
     Section 9101 of Title IX states that an LEP student:
     - Is between the ages of 3 and 21
     - Is enrolled or preparing to enroll in an elementary or secondary school
     - Has one of these three profiles:
       - Was not born in the US or speaks a native language other than English
       - Is a Native American or Alaska Native who comes from an environment where a language other than English has had a significant impact on his or her level of English language proficiency
       - Is migratory, has a native language other than English, and comes from an environment where a language other than English is dominant
  4. Has difficulties in speaking, reading, writing, or understanding the English language that are so severe as to deny the individual one of the following:
     - The ability to meet the state’s proficient level of achievement on state assessments described in section 1111(b)(3) of the NCLB Act
     - The ability to achieve successfully in classrooms where the language of instruction is English
     - The opportunity to participate fully in society
  5. Factors Influencing the Assessment of ELLs
     - Different linguistic backgrounds
     - Varying levels of proficiency in English
     - Varying levels of proficiency in native language
     - Varying degrees of formal schooling in native language
     - Varying degrees of formal schooling in English
     - Varying degrees of exposure to standardized testing
     - Varying degrees of acculturation to US mainstream
  6. Steps within the Planning Process
     - Test purpose – must be clear in order for valid interpretations to be made on the basis of test scores
     - Defining the construct – a definition of what the test is intended to measure
     - Developing the assessment specifications – define the content and explain how it will be assessed
       - The state will provide content and performance standards
       - Tests with more items will supply more reliable scores
       - ELLs should have multiple ways to show what they know
       - Assessments should include a variety of item and response types
  7. The weight given to a task is determined by its importance relative to other assessed tasks
     - Assessment specifications describe how tasks will be presented and how students are expected to respond
     - Diagrams or tables will help ELLs better demonstrate what they know
     - Each educational agency should be able to provide information about the cultural backgrounds of its test-taking population
     - Test materials should include references to major groups in the tested population and reflect awareness of cultural diversity in readings and illustrations
  8. Developing Test Items and Scoring Criteria
     - Test items should be linked to the content and skills they are intended to measure
     - All test items should match the content guidelines with specificity
     - Material should be appropriate and accessible to examinees and should measure only the intended construct
     - Item writers should not assume that students have had any previous experience with given tasks
     - The criteria for evaluation should also be made clear to the student
  9. Language in directions should be simplified, and directions may be given orally or in a language other than English if that will provide the best understanding for ELLs
     - Clear and accessible language should be used
     - Colloquialisms and idiomatic expressions should be avoided
     - Sentence structures should be kept simple
     - The use of negatives should be avoided
     - Fictional contexts should be kept simple
     - Presentation should be clear and consistent
     - The ETS Standards for Quality and Fairness should be applied
  10. External Reviews of Test Materials
      - Reviews offer an effective technique to improve the quality of assessments
      - Reviewers are often chosen for their knowledge of the ELL population and the specific challenges ELLs may face
      - Materials must pass committee reviews; committees should include content experts and professionals who are familiar with issues affecting different ELL populations
  11. Some examples of questions that should be addressed by reviewers:
      - Does each task match the purpose of the assessment and the assessment specifications?
      - Are the formats of both the assessment and the response materials appropriate?
      - Do the tasks and scoring criteria meet standards for fairness?
  12. Evaluating the Tasks Through Tryouts
      - Useful information is provided through field testing items
      - Data collected during these tryouts is used to:
        - Inform content and fairness reviews of the items
        - Evaluate the clarity of instructions to examinees
        - Assess whether ELLs of different proficiency levels can understand the text of the items
  13. Types of Item Tryouts can include:
  14. One-on-One Interviews
      - Used to directly obtain information from students about their thought processes while answering the items
      - Also used to obtain information about their understanding of complex language
  15. Small-Scale Pilot Tests
      - Used to figure out how items will work and how students will respond to them
  16. Large-Scale Pilot Tests
      - Used to obtain reliable and valid statistics that can be used when selecting items for test forms
      - Statistics based on these responses are generally accurate indicators
  17. Limitations of Item Tryouts
      - Differences in:
        - Demographics
        - Curriculum
        - Culture
      - All three may make comparisons difficult
  18. Scoring
      - Rubrics are constructed after determining which English language skills are required for answering a given item
      - The role that English language skills should play in determining a score must also be decided
      - Scorers must have an understanding of the language or presentation style examinees use, and will receive training in how to interpret responses and the scoring rubric in a linguistically sensitive way
  19. Testing Accommodations for ELLs
      - Equity and validity in assessment are the main purposes of providing examinees with testing accommodations
      - The ETS Guidelines specifically state, “ELLs should have the same opportunity as students who have English as their first language to demonstrate their knowledge or skills in a content area”
  20. Accommodations for ELLs can include changes made to:
      - Presentation of test materials
      - Students’ responses to test items
      - Scheduling
      - Test setting
      Testing modifications are a bit different and can alter what the assessment measures
  21. Accommodations
      - ELLs currently do not have any type of documentation (like an Individualized Education Plan) that identifies them as eligible for accommodations
      - However, if an ELL’s proficiency level is so low that the assessment would not be valid, the student will receive accommodations
  22. Accommodations
      - When students receive accommodations in the classroom, the same accommodations are allowed in testing situations, as long as they are appropriate
      - Students may receive:
        - Direct linguistic support, which makes adjustments to the actual language of a test, such as translated texts
        - Indirect linguistic support, which involves changes in the testing environment, such as extended time
  23. Statistics are used to evaluate assessment and scoring
      - Test scores need to be reliable and valid
      - Methods of investigating validity include:
        - Analysis of internal test structure
        - Relations to other variables/constructs
        - Test speededness
      - These investigations are carried out in accordance with the ETS Standards for Quality and Fairness