Eaquals apr 14 barker

  1. Assessment literacy: What is it and how best to achieve it? Fiona Barker, Cambridge English Language Assessment, 25 April 2014, Budapest © UCLES 2014
  2. Outline 1. Rationale for this workshop 2. Our experiences of assessment 3. Our view of assessment 4. An exam board’s approach to assessment 5. Impacts of assessment 6. Connecting teaching, learning and assessment 7. Find out more: assessment resources 8. Applying assessment knowledge in your context © UCLES 2014
  3. 1. Rationale • Explore some of the key concepts of assessment that all practitioners should have a working knowledge of in order to teach effectively and to make the most of assessment opportunities. • Think about how our learners would benefit from a better understanding of why and how they are assessed, and how to support them to become more autonomous. • We will draw upon existing approaches and our experiences of learning about and practising assessment in a variety of contexts, including language schools and an exam board. © UCLES 2014
  4. To start us thinking, ask yourself: • What do I know about assessment already? • What would I like to know about assessment? • What do I need to do to get to this level of understanding and be able to apply it? © UCLES 2014
  5. 2. Our experiences of assessment 1. Learning 2. Being 3. Doing © UCLES 2014
  6. 2.1 Learning about assessment 1. How did you learn about assessment in your role as a teacher/trainer/manager etc.? 2. Were you explicitly taught about assessment, were you expected to pick it up by yourself, or did something else happen? © UCLES 2014
  7. 2.2 Being assessed 1. What is your earliest memory of being assessed? 2. What was positive/negative about it for you? 3. Do you have another positive or negative memory of being assessed? © UCLES 2014
  8. 2.3 Doing assessment • What is your top tip for assessing students? © UCLES 2014
  9. 3. Our view of assessment • What is it? • Are there different types? • How can it be done? © UCLES 2014
  10. What is assessment? • What words do you associate with assessment? © UCLES 2014
  11. [Word cloud of words associated with assessment, created at www.wordle.net] © UCLES 2014
  12. Are there different types? • Name as many different types of assessment as you can • Tests • Quizzes • Portfolios • Recordings • Presentations • etc. © UCLES 2014
  13. Aspects of assessment
      Who? • Assesses: self / peer / teacher / examiner • Takes part: singleton / pair / group
      When? • Planned or ad hoc • Regularly, e.g. every lesson, end of week / topic / module / term, etc.
      What conditions? • Exam or freer • Timed or not • Support or resources permitted • Online or offline • Face to face or distance
      Where? • Exam hall / classroom / home / testing centre / other
      Why? • Many purposes © UCLES 2014
  14. 4. An exam board’s approach • Purpose of assessment • Brief history • Essential test qualities • How to evaluate a test • Topical issues © UCLES 2014
  15. Purpose of assessment • Language assessment aims to measure a hidden trait (i.e. language ability). • This measurement allows us to make inferences about an individual’s language ability. • We make inferences on the basis of an individual’s observed behaviour(s) in the assessment situation, usually on the basis of test scores attached to them (score interpretation). • We need to think about the correspondence between general language use in a non-test context (the target language use situation) and specific performance in a language test. © UCLES 2014
  16. Measuring constructs How? • Link the hidden trait to observable behaviour • Attach test scores to the observable behaviour © UCLES 2014
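One standard way to formalise the link between a hidden trait and the scores we attach to behaviour, not named on the slide but consistent with it, is the true-score model of classical test theory:

```latex
% Classical test theory (standard psychometrics, added for context):
% an observed score X is a hidden true score T plus random error E.
X = T + E, \qquad
\text{reliability} = \frac{\operatorname{Var}(T)}{\operatorname{Var}(X)}
                   = \frac{\sigma_T^2}{\sigma_T^2 + \sigma_E^2}
```

On this view a perfectly reliable test has no error variance; in practice the aim is to keep the error term small relative to the true-score variance, which is what the reliability slides below address.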
  17. Brief history • Discrete point tests (1960s): language broken down into components and tested separately, e.g. reading, grammar, phonology • Integrative testing (1970s): tests tap into several competencies, e.g. writing a composition, cloze test, dictation • Communicative language testing & performance-based assessment (1980s onwards): correspondence between test and non-test situations, e.g. oral/written production, open-ended responses, integrated tasks © UCLES 2014
  18. Example: Measuring Anxiety • What behaviours are associated with anxiety? • How could we measure these behaviours? • We would then attach a level or score to these behaviours to interpret what they mean. © UCLES 2014
  19. Essential test qualities: Validity • Reliability • Impact • Practicality © UCLES 2014
  20. Validity CONSTRUCT VALIDITY is the central concept: the extent to which test scores can be considered a true reflection of the underlying ability the test is trying to measure. Q: Does the test measure what it is supposed to measure? FITNESS FOR PURPOSE: tests/test scores are not ‘valid’ in themselves, they are VALID FOR A PURPOSE. © UCLES 2014
  21. Reliability The extent to which test scores are consistent and accurate, and therefore free of measurement error. Q: Is there consistency of measurement? Test reliability and rater reliability: • Objectively marked tests, e.g. a multiple-choice test: statistical measures • Subjectively marked tests, e.g. a writing/speaking test: rater reliability Q: How can test reliability be increased? © UCLES 2014
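As a sketch of the rater-reliability check mentioned above, the following Python snippet compares two raters' marks for the same scripts. The scores and the 0-5 scale are invented for illustration; only numpy is assumed:

```python
import numpy as np

# Hypothetical marks two independent raters gave the same ten
# writing scripts on a 0-5 scale (illustrative numbers only).
rater_a = np.array([3, 4, 2, 5, 3, 4, 1, 3, 4, 2])
rater_b = np.array([3, 4, 3, 5, 2, 4, 1, 3, 5, 2])

# Inter-rater reliability as a Pearson correlation: values near 1
# mean the raters rank the candidates consistently.
r = np.corrcoef(rater_a, rater_b)[0, 1]

# Exact agreement: proportion of scripts given identical marks.
agreement = np.mean(rater_a == rater_b)

print(f"Pearson r = {r:.2f}, exact agreement = {agreement:.0%}")
```

Exam boards use more robust operational statistics (e.g. kappa coefficients or many-facet Rasch analysis), but the underlying question is the same: is there consistency of measurement?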
  22. Enhancing test reliability • Take enough samples of behaviour. • Do not allow candidates too much freedom: ‘Write a composition on tourism’ vs. ‘Write a composition on how the tourist industry in this country might be developed.’ • Write unambiguous items. • Provide clear and explicit instructions. • Ensure candidate familiarity with format/testing techniques. • Provide uniform administration conditions. • Make scoring as objective as possible. • Train scorers and get multiple, independent scoring. (Hughes, 1989) © UCLES 2014
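Hughes's last tip, multiple independent scoring, has a standard psychometric rationale: the Spearman-Brown prophecy formula (not on the slide, but a textbook result of classical test theory) predicts how reliability rises when the marks of k independent raters are averaged. A minimal sketch:

```python
def spearman_brown(single_rater_reliability: float, k: int) -> float:
    """Predicted reliability when k independent raters' marks are
    averaged (Spearman-Brown prophecy formula)."""
    r = single_rater_reliability
    return k * r / (1 + (k - 1) * r)

# A single rater with reliability 0.60 (illustrative value only):
for k in (1, 2, 3, 4):
    print(k, round(spearman_brown(0.60, k), 2))
# prints: 1 0.6 / 2 0.75 / 3 0.82 / 4 0.86
```

The gains diminish as raters are added, which is why double marking plus adjudication of discrepancies is a common practical compromise.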
  23. Impact The effect of a test on test takers, education systems and society more widely: ‘the larger framing and social meaning of assessment’ (McNamara, 2000) • Micro: effect on classrooms (washback) • Macro: effect on society © UCLES 2014
  24. Ensuring positive washback and impact • Test only those abilities whose development you want to encourage, not what is easiest to test. • Give sufficient weight to those abilities. • Sample widely from the non-test domain. • Use direct testing (i.e. performance skills). • Make testing criterion-referenced. • Provide assistance to teachers. © UCLES 2014
  25. Practicality Balancing required resources and available resources © UCLES 2014
  26. Balancing test qualities: Validity • Reliability • Impact • Practicality © UCLES 2014
  27. How to evaluate a test Claims → Evidence → Weakness/strength of claims → Building a validity argument © UCLES 2014
  28. Useful framework for evaluating tests Cyril Weir’s (2005) ‘socio-cognitive framework’ for validating language tests: a framework of questions about the validity of language tests © UCLES 2014
  29. Weir’s (2005) socio-cognitive framework 1. How are test taker (TT) characteristics catered for by this test? 2. Are the cognitive processes required to complete the tasks appropriate? 3. Are the characteristics of the test tasks and administration fair to TTs? 4. How far can we depend on the scores of the test? 5. What effects does the test have on its stakeholders? 6. What external evidence is there beyond test scores that the test is doing a good job? © UCLES 2014
  30. Topical issues in assessment • Language testing’s links to educational / social / political policy. • Public accountability and ethical behaviour of test producers and users. • Technological advances are reshaping the design and delivery of language tests. • Growing understanding of language acquisition, development and use, and advances in linguistics, are affecting how we define and assess language proficiency. • We are reconceptualising communication in relation to pedagogy and assessment. © UCLES 2014
  31. To summarise: A test should • consistently provide accurate measures of precisely the abilities in which we are interested (VALIDITY & RELIABILITY) • have a beneficial effect on teaching and learning (IMPACT) • be economical in terms of time and money (PRACTICALITY) In short, be FIT FOR PURPOSE. Different test purposes: • to measure communicative language ability • to measure lexico-grammatical knowledge • to measure success in achieving course objectives • to assist in placement of students into different groups © UCLES 2014
  32. 5. Impact of assessment • Who is affected by assessment in general? • What are some potential benefits? • What challenges can you think of? • What can we do to alleviate these challenges? One approach links learning, teaching and assessment. © UCLES 2014
  33. 6. Teaching, Learning and Assessment: How do they connect? • Through Learning Oriented Assessment • Assessment for learning as well as assessment of learning, involving feedback and feed forward © UCLES 2014
  34. Locating LOA within the educational landscape: Is LOA a kind of formative assessment? A kind of summative assessment? Something else? © UCLES 2014
  35. Formative assessment © UCLES 2014
  36. Formative assessment • is on-going assessment during a period of study • responds to the evolving needs of the learner • relates to identified learning objectives • implies scaffolding learning to help learners reach identified learning objectives © UCLES 2014
  37. Which of these activities can be part of formative assessment? A. Observing learners during a speaking activity and identifying points for further development. B. Setting regular progress tests. C. Noting down learners’ mistakes in a writing activity to do further work on in class. D. Evaluating learners’ responses to a listening activity: which elements are they having difficulty with, what do they need to work on and what have they understood well? E. All of the above. © UCLES 2014
  38. Formative assessment: pros & cons • Has a natural affiliation with teaching and learning • Emphasises interaction, support and development BUT • Often based on the teacher’s intuition • Seen as lacking reliability & validity © UCLES 2014
  39. Summative assessment © UCLES 2014
  40. Summative assessment • involves tests at the end of a period of study (e.g. unit, week, term, course) • is typically linked to, and looks back at, the syllabus • gives an indication of the learner’s ability or overall proficiency • is often used for certification purposes © UCLES 2014
  41. Summative assessment: pros & cons • results can be generalised beyond the test context • tends to be designed with validity and reliability in mind BUT • is often perceived as just “grading” • could involve “teaching to the test” © UCLES 2014
  42. Traditional distinction: Summative = a kind of judgement, evaluating what has happened before; Formative = a kind of purpose, guiding what happens next © UCLES 2014
  43. Multiple functions of assessment “Every act of assessment we devise or have a role in implementing has more than one purpose. If we do not pay attention to these multiple purposes we are in danger of inadvertently sabotaging one or more of them. Assessment has to serve double duty.” Boud (2000:159) © UCLES 2014
  44. Whatever form assessment takes, classroom-based or large-scale standardised: providing evidence/feedback + supporting learning = learning oriented assessment © UCLES 2014
  45. Strengthening the link between learning, teaching and assessment “… for all assessments, whether predominantly summative or formative in function, a key aim is to promote productive student learning.” Carless (2009:80) Defining LOA: “[LOA] involves the collection and interpretation of evidence about performance so that judgements can be made about further language development”, i.e. to promote learning. Purpura (2004:236) © UCLES 2014
  46. Learning Oriented Assessment • captures the centrality of learning within assessment (not an afterthought) • challenges the traditional view that exams are external and summative © UCLES 2014
  47. A Model of LOA • the macro level: framing educational goals and evaluating outcomes (policy context) • the micro level: individual learning interactions which take place within and outside of the classroom (learning environment) © UCLES 2014
  48. Key features LOA relies on a systematic approach to assessment: • Quality/appropriacy of evidence gathered and interpretation made • Appropriacy of feedback/modifications to instruction • Development of learner autonomy / life-long learning • An alignment between external measures and classroom-based assessment © UCLES 2014
  49. Collecting evidence • What evidence of learning do you use? • When and how do you collect it? • How do you ‘know’ that it is useful evidence? • Record your answers. © UCLES 2014
  50. Collecting evidence Use multiple sources (triangulation): • scores (tests/quizzes) • observation (performance) • past experiences with similar learners • learners themselves • engaging in action research Evidence tells you something about learner ability. © UCLES 2014
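To make the triangulation idea concrete, here is a minimal Python sketch of an evidence record that keeps the source alongside each observation, so no single source dominates the picture of a learner. All names here (Evidence, LearnerProfile, the source labels) are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Evidence:
    source: str                    # e.g. "quiz", "observation", "self-report"
    note: str
    score: Optional[float] = None  # not every source yields a number

@dataclass
class LearnerProfile:
    name: str
    evidence: list = field(default_factory=list)

    def add(self, source: str, note: str, score: Optional[float] = None) -> None:
        self.evidence.append(Evidence(source, note, score))

    def sources(self) -> set:
        # Triangulation check: how many distinct sources back this profile?
        return {e.source for e in self.evidence}

maria = LearnerProfile("Maria")
maria.add("quiz", "Unit 3 vocabulary", score=8.5)
maria.add("observation", "Hesitant in pair speaking task")
maria.add("self-report", "Finds listening hardest")
print(sorted(maria.sources()))  # ['observation', 'quiz', 'self-report']
```

A judgement about this learner's ability then rests on several kinds of evidence rather than a single test score, which is the point of the triangulation bullet above.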
  51. Learning Oriented Assessment • Integrating different forms of assessment • Basing assessment on learning objectives • Using assessment to support learning • A single information system for: individual feedback, monitoring progress, student and class profiles, end-of-course reports © UCLES 2014
  52. Who benefits from LOA? • multiple examples of evidence for learning • clear evidence of progress towards learning objectives • work at the right level • receive relevant and timely feedback • become independent learners • monitor progress towards targets • valid, reliable and recognised certification © UCLES 2014
  53. 8. Applying assessment knowledge in your context 1. We have to obtain knowledge and then apply it to make sense of what we have learnt. 2. One example is through undertaking Action Research. © UCLES 2014
  54. What is Action Research? AR involves teachers exploring a specific challenge that they have identified in their own teaching context, through several cycles of research. AR is a form of teacher research. © UCLES 2014
  55. Why should teachers do Action Research? “It is practitioners in their immediate social situation who are best placed to understand, examine and innovate in curriculum-related issues and challenges.” (Burns, 2011) © UCLES 2014
  56. Examples of AR projects • Developing reading skills of Arabic students • Formative assessment in a Web 2.0 environment • Student attitudes to EAP grammar instruction • Encouraging students to become independent learners through self-assessment and reflection • Using writing rubrics to develop learner autonomy • Creating a blog for self-assessment • Introducing learning portfolios © UCLES 2014
  57. Summary of impact of Action Research (at individual, institutional and sectoral levels) • teaching/research skills & knowledge • professional development • longer-term impact • new reputational dimension • rejuvenation of practice • career options • programme dissemination • strengthened practice • engagement & motivation • enhanced PD & professionalism • ‘ripple effect’ © UCLES 2014
  58. 9. Find out more: Assessment resources 1. Self-access materials, including webinars, videos and materials produced by exam boards (see following slides for Cambridge English examples). 2. Online or F2F courses run by: ALTE www.alte.org/events, EALTA www.ealta.eu.org/events, CET www.cambridgeenglishteacher.org/courses or CAN www.cambridgeassessment.org.uk/events/cppa 3. Join a discussion list or online group for teachers or researchers, e.g. ESOL-RESEARCH@JISCMAIL.AC.UK, or try the links at http://iteslj.org/links/ 4. Talk to your colleagues about assessment. 5. Reflect on your/your learners’ experiences. 6. Find out more about action research; consider doing your own project on an assessment-related issue. © UCLES 2014
  59. Cambridge English resources • Webinars for teachers: www.cambridgeenglish.org/webinars/ • Cambridge English TV: www.youtube.com/user/cambridgeenglishtv • Cambridge English Teacher: www.cambridgeenglishteacher.org/ • Teacher support website: www.teachers.cambridgeesol.org • Principles of Good Practice booklet: www.cambridgeenglish.org/research-and-validation/quality-and-accountability/ © UCLES 2014
  60. Find out more: Action Research Some key writers: • Simon Borg, resources at: http://simon-borg.co.uk/free-sources-of-language-teaching-research • Anne Burns, start with her Action Research video at: http://professoranneburns.com/arvideo.htm Online resources: • Action Research in Education course: www.edu.plymouth.ac.uk/resined/actionresearch/arhome.htm • Research Notes 44, 48, 53 (reports of Australian projects): www.cambridgeenglish.org/researchnotes/ • Recent webinar at www.CambridgeEnglishTeacher.org © UCLES 2014
  61. Find out more: LOA Online resources: • Cambridge English approach, resources and FAQs: www.cambridgeenglish.org/research-and-validation/fitness-for-purpose/ • Webinar and videos on Cambridge English TV (YouTube) © UCLES 2014
  62. What have we achieved today? • Explored some of the key concepts of assessment and our own experiences of assessment, before thinking about its impact on teaching/learning and society. • Thought about how learners and teachers could benefit from a better understanding and use of assessment, through a Learning Oriented Assessment approach or undertaking some Action Research. • Looked at some of the resources available. • Laid the groundwork to enable you to think about how assessment literacy can be applied in your context. © UCLES 2014
  63. References 1 ALTE (1998) Multilingual Glossary of Language Testing Terms, UCLES/Cambridge University Press. Bachman, L and Palmer, A (1996) Language Testing in Practice, Oxford: Oxford University Press. (See Chapters 2 and 4.) Boud, D (2000) Sustainable assessment: rethinking assessment for the learning society, Studies in Continuing Education 22 (2), 151–167. Boud, D (2006) Foreword, in Carless, D, Joughlin, G, Liu, N F and Associates, How Assessment Supports Learning: learning-oriented assessment in action, Hong Kong: Hong Kong University Press. Burns, A (2010) Doing Action Research in English Language Teaching: A guide for practitioners, New York: Routledge. Burns, A (2012) Teacher research in a national programme: Impact and implications, Research Notes 48, 3–7.
  64. References 2 Carless, D (2007) Learning-oriented assessment: conceptual issues and practical implications, Innovations in Education and Teaching International 44 (1), 57–66. Carless, D (2009) Learning-oriented Assessment: Principles and Practice and a Project, in Meyer, L H, Davidson, H, Anderson, R, Fletcher, P M, Johnston and Rees, M (Eds), Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research, Wellington, New Zealand. Hughes, A (1989/2003) Testing for Language Teachers, Cambridge: Cambridge University Press. McNamara, T (2000) Language Testing, Oxford: Oxford University Press. © UCLES 2014
  65. References 3 Purpura, J (2004) Learning-oriented assessments of grammatical ability, in Assessing Grammar, Cambridge: Cambridge University Press. Weir, C and Milanovic, M (2003) Continuity and Innovation: Revising the Cambridge Proficiency in English Examination 1913–2002, UCLES/Cambridge University Press. Weir, C J (2005) Language Testing and Validation: An evidence-based approach, Oxford: Palgrave Macmillan. © UCLES 2014
