Assessment literacy apr 14 (fiona barker)
- 1. Assessment literacy: What is it
and how best to achieve it?
Fiona Barker
Cambridge English Language Assessment
© UCLES 2014
- 2. Outline
1. Rationale for this workshop
2. Our experiences of assessment
3. Our view of assessment
4. An exam board's approach to assessment
5. Impacts of assessment
6. Connecting teaching, learning and assessment
7. Find out more: assessment resources
8. Applying assessment knowledge in your context
- 3. 1. Rationale
• Explore some of the key concepts of assessment that all
practitioners should have a working knowledge of in order
to teach effectively and to make the most of assessment
opportunities.
• Think about how our learners would benefit from a better
understanding of why and how they are assessed, and
how we can support them to become more autonomous.
• We will draw upon existing approaches and our
experiences of learning about and practising assessment
in a variety of contexts, including language schools and in
an exam board.
- 4. To start us thinking, ask yourself:
• What do I know about assessment already?
• What would I like to know about assessment?
• What do I need to do to get to this level of
understanding and be able to apply it?
- 6. 2.1 Learning about assessment
1. How did you learn about assessment in your role as a
teacher/trainer/manager etc.?
2. Were you explicitly taught about assessment,
expected to pick it up by yourself, or did something
else happen?
- 7. 2.2 Being assessed
1. What is your earliest memory of being assessed?
2. What was positive/negative about it for you?
3. Do you have another positive or negative memory of
being assessed?
- 9. 3. Our view of assessment
• What is it?
• Are there different types?
• How can it be done?
- 12. Are there different types?
• Name as many different types of assessment as you can
• Tests
• Quizzes
• Portfolios
• Recordings
• Presentations
• etc
- 13. Aspects of assessment
Who?
• Assesses: Self / Peer /
Teacher / Examiner
• Takes part: singleton /
pair / group
When?
• planned or ad hoc
• regularly – e.g. every
lesson, end of week /
topic / module / term /
etc.
What conditions?
• Exam or freer
• Timed or not
• Support or resources
permitted
• Online or offline
• Face to face or distance
Where?
• Exam hall / classroom /
home / testing centre /
other
Why?
• Many purposes
- 14. 4. An exam board's approach
• Purpose of assessment
• Brief history
• Essential test qualities
• How to evaluate a test
• Topical issues
- 15. Purpose of assessment
• Language assessment aims to measure a hidden
trait (i.e. language ability).
• This measurement allows us to make inferences
about an individual's language ability.
• We make inferences on the basis of an individual's
observed behaviour(s) in the assessment situation,
usually on the basis of test scores attached to them –
score interpretation.
• We need to think about the correspondence between
general language use in a non-test context (the
target language use situation) and specific
performance in a language test.
- 17. Brief history
• Discrete point tests (1960s)
Language broken down into components and tested
separately, e.g. reading, grammar, phonology
• Integrative testing (1970s)
Tests tap into several competencies, e.g. writing a
composition, cloze test, dictation
• Communicative language testing & performance
based assessment (1980s onwards)
Correspondence between test and non-test situations, e.g.
oral/written production, open-ended responses, integrated
tasks
- 18. Example: Measuring Anxiety
• What behaviours are associated with anxiety?
• How could we measure these behaviours?
• We would then attach a level or score to these
behaviours to interpret what they mean.
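The move described above (observe behaviours, score them, attach a level) can be sketched in code. This is a minimal, hypothetical sketch: the behaviour names, weights and cut-offs are all invented for illustration, not taken from the workshop.

```python
# Hypothetical sketch: operationalising a hidden trait (anxiety) by
# scoring observable behaviours and attaching a level to the score.
# All behaviour names, weights and cut-offs are invented.
BEHAVIOUR_WEIGHTS = {
    "avoids_eye_contact": 1,
    "speech_hesitations": 2,
    "trembling_voice": 3,
}

def anxiety_score(observed):
    """Sum the weights of the behaviours actually observed."""
    return sum(BEHAVIOUR_WEIGHTS[b] for b in observed)

def interpret(score):
    """Attach a level to the raw score so it can be interpreted."""
    if score >= 4:
        return "high"
    if score >= 2:
        return "moderate"
    return "low"

score = anxiety_score(["speech_hesitations", "trembling_voice"])
print(score, interpret(score))  # 5 high
```

The same pattern (observe behaviour, score it, interpret the score) underlies the score interpretation described earlier for language ability.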
- 20. Validity
CONSTRUCT VALIDITY: central concept
Extent to which test scores can be considered a true
reflection of the underlying ability the test is trying to measure
Q: Does the test measure what it is supposed to measure?
FITNESS FOR PURPOSE
Tests/test scores are not 'valid', they are VALID FOR A
PURPOSE
- 21. Reliability
Extent to which test scores are consistent and accurate,
and therefore free of measurement error
Q: Is there consistency of measurement?
Test reliability and rater reliability
• Objectively marked tests, e.g. a multiple choice test
statistical measures
• Subjectively marked tests, e.g. a writing/speaking test
rater reliability
Q: How can test reliability be increased?
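For subjectively marked tests, rater reliability is usually checked statistically. As one illustrative sketch, here is Cohen's kappa, a standard chance-corrected agreement statistic (not necessarily the measure an exam board would use); the ratings are invented.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters (Cohen's kappa)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of scripts on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Two raters grading the same ten writing scripts (invented data)
a = ["pass", "pass", "fail", "pass", "border", "fail", "pass", "border", "pass", "fail"]
b = ["pass", "pass", "fail", "border", "border", "fail", "pass", "pass", "pass", "fail"]
print(round(cohens_kappa(a, b), 2))  # 0.68
```

Values near 1 indicate consistent rating; low values suggest the raters need training or clearer criteria, echoing the advice below to train scorers and use multiple, independent scoring.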
- 22. Enhancing test reliability
• Take enough samples of behaviour.
• Do not allow candidates too much freedom:
Write a composition on tourism vs. Write a composition on how
the tourist industry in this country might be developed.
• Write unambiguous items.
• Provide clear and explicit instructions.
• Ensure candidate familiarity with format/testing techniques.
• Provide uniform administration conditions.
• Make scoring as objective as possible.
• Train scorers and get multiple, independent scoring.
(Hughes, 1989)
- 23. Impact
The effect of a test on test takers, education systems and
society more widely:
'the larger framing and social meaning of assessment'
(McNamara, 2000)
• Micro: effect on classrooms
(washback)
• Macro: effect on society
- 24. Ensuring positive washback
and impact
• Test only those abilities whose development you want to
encourage and not what is easiest to test.
• Give sufficient weight to those abilities.
• Sample widely from the non-test domain.
• Use direct testing (i.e. performance skills).
• Make testing criterion-referenced.
• Provide assistance to teachers.
- 27. How to evaluate a test
Claims
Evidence
Weakness/strength of claims
Building a validity argument
- 28. Useful framework for evaluating tests
Cyril Weir's (2005) 'socio-cognitive framework' for
validating language tests
a framework of questions about the validity of
language tests
- 29. Weir's (2005) socio-cognitive framework
The framework traces test taker characteristics through context
validity and cognitive validity to the response, then through scoring
validity to the score/grade, and on to criterion-related and
consequential validity, asking six questions:
1. How are test taker characteristics catered for by this test?
2. Are the cognitive processes required to complete the tasks
appropriate?
3. Are the characteristics of the test tasks and administration
fair to test takers?
4. How far can we depend on the scores of the test?
5. What effects does the test have on its stakeholders?
6. What external evidence is there beyond test scores that the
test is doing a good job?
- 30. Topical issues in assessment
• Language testing's links to educational / social / political
policy.
• Public accountability and ethical behaviour of test
producers and users.
• Technological advances are reshaping the design and
delivery of language tests.
• Growing understanding of language acquisition,
development and use and advances in linguistics are
affecting how we define and assess language
proficiency.
• We are reconceptualising communication in relation to
pedagogy and assessment.
- 31. To summarise:
A test should:
• Consistently provide accurate measures of precisely
the abilities in which we are interested (VALIDITY &
RELIABILITY)
• Have a beneficial effect on teaching and learning
(IMPACT)
• Be economical in terms of time and money
(PRACTICALITY)
Be FIT FOR PURPOSE
Different test purposes:
to measure communicative language ability
to measure lexico-grammatical knowledge
to measure success in achieving course objectives
to assist in placement of students into different groups …
- 32. 5. Impact of assessment
• Who is affected by assessment in general?
• What are some potential benefits?
• What challenges can you think of?
- 33. • What challenges can you think of?
• What can we do to alleviate these challenges? One
approach links learning, teaching and assessment.
- 34. 6. Teaching, Learning and
Assessment: How do they connect?
• Through Learning Oriented Assessment
• Assessment for learning as well as assessment of
learning, involving feedback and feed forward
- 35. Locating LOA within the educational
landscape
Is LOA…
…a kind of formative assessment?
…a kind of summative assessment?
…?
- 37. • is on-going assessment ___ a period of study
• responds to the evolving needs of the ___
Formative assessment
- 38. • is on-going assessment during a period of study
• responds to the evolving needs of the learner
• relates to identified learning objectives
• implies scaffolding learning to help learners
reach identified learning objectives
Formative assessment
- 39. Which of these activities can be part of
formative assessment?
A. Observing learners during a speaking activity and
identifying points for further development.
B. Setting regular progress tests.
C. Noting down learners' mistakes in a writing activity to do
further work on in class.
D. Evaluating learners‟ responses to a listening activity.
Which elements are they having difficulty with, what do
they need to work on and what have they understood
well?
E. All of the above.
- 40. Formative assessment: pros & cons
• Has a natural affiliation with teaching and
learning
• Emphasises interaction, support and
development
BUT…
• Often based on the teacher's intuition
• Seen as lacking reliability & validity
- 42. • involves tests ___ of a period of study
• is typically linked to and looks ___ at the
syllabus
Summative assessment
- 43. • involves tests at the end of a period of study
(e.g. unit, week, term, course)
• is typically linked to and looks back at the
syllabus
• is an indication of the learner's ability or
overall proficiency
• often used for certification purposes
Summative assessment
- 44. • results can be generalised beyond test
context
• tends to be designed with validity and
reliability in mind
BUT
• is often perceived as just “grading”
• could involve “teaching to the test”
Summative assessment: pros & cons
- 45. Traditional distinction
Summative: evaluating what has happened before (a kind of judgement)
Formative: guiding what happens next (a kind of purpose)
- 46. Multiple functions of assessment
“Every act of assessment we devise or have a
role in implementing has more than one
purpose. If we do not pay attention to these
multiple purposes we are in danger of
inadvertently sabotaging one or more of them …
Assessment has to serve double duty.”
Boud (2000:159)
- 48. Strengthening the link between
learning, teaching and assessment
“… for all assessments whether predominantly
summative or formative in function a key aim is to
promote productive student learning.”
Carless (2009:80)
Defining LOA:
“[LOA] involves the collection and interpretation of
evidence about performance so that judgements
can be made about further language development”
… to promote learning
Purpura (2004:236)
- 49. Learning Oriented Assessment
• captures the centrality of learning within assessment
(not an afterthought)
• challenges the traditional view that exams are
external and summative
- 50. A Model of LOA
• the macro level – framing educational goals
and evaluating outcomes (policy context)
• the micro level – individual learning
interactions which take place within and
outside of the classroom (learning
environment)
- 51. Overview of a model of LOA
Macro level (setting and monitoring targets):
learning objectives, course, external exam, record of
achievement, monitoring of performance, interpretation.
Micro level (materials, classroom practice):
task, language activity, observation/evidence gathering
(whole group, subgroup, individual characteristics),
informal record, structured record, interpretation,
teacher decision-making, feedback and modified
learning objectives.
- 52. Key features
LOA relies on a systematic approach to
assessment:
• Quality/appropriacy of evidence gathered and
interpretation made
• Appropriacy of feedback/modifications to instruction
• Development of learner autonomy / life-long learning
• An alignment between external measures and
classroom-based assessment
- 53. What evidence of learning do you use?
When and how do you collect it?
How do you 'know' that it is useful evidence?
Collecting evidence → record
- 54. Collecting evidence
Use multiple sources (triangulation)
• scores (tests/quizzes)
• observation (performance)
• past experiences with similar learners
• learners themselves
• engaging in action research
Evidence tells you something about learner ability
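One way to picture triangulation is a single record per learner that accumulates evidence from several sources and is summarised before interpretation. A minimal sketch, with invented field names and scales (nothing here is a prescribed Cambridge English system):

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class LearnerEvidence:
    """Accumulates evidence about one learner from multiple sources."""
    name: str
    quiz_scores: list = field(default_factory=list)   # scores (tests/quizzes)
    observations: list = field(default_factory=list)  # notes on observed performance
    self_ratings: list = field(default_factory=list)  # from the learners themselves

    def summary(self):
        """Condense the evidence for interpretation by the teacher."""
        return {
            "mean_quiz": mean(self.quiz_scores) if self.quiz_scores else None,
            "n_observations": len(self.observations),
            "mean_self_rating": mean(self.self_ratings) if self.self_ratings else None,
        }

e = LearnerEvidence("Ana", quiz_scores=[7, 8, 6], self_ratings=[3, 4])
e.observations.append("hesitant during pair speaking task")
print(e.summary())
```

The point is the triangulation itself: no single source (a quiz score, an observation, a self-rating) is interpreted on its own.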
- 55. Learning Oriented Assessment
• Integrating different forms of assessment
• Basing assessment on learning objectives
• Using assessment to support learning
• A single information system for: individual feedback,
monitoring progress, student and class profiles,
end of course reports
- 56. Who benefits from LOA?
• multiple examples of evidence for
learning
• clear evidence of progress towards
learning objectives
• work at right level
• receive relevant and timely feedback
• become independent learners
• monitor progress towards targets
• valid, reliable and recognised
certification
- 57. 8. Applying assessment
knowledge in your context
1. We have to obtain knowledge and then apply it to make
sense of what we have learnt.
2. One example is through undertaking Action Research.
- 58. What is Action Research?
AR involves teachers exploring a specific challenge
… that they have identified
… in their own teaching context
… through several cycles of research
AR is a form of teacher research.
- 59. “It is practitioners in their immediate social
situation who are best placed to understand,
examine and innovate in curriculum-related
issues and challenges.”
(Burns, 2011:3)
Why should teachers do Action
Research?
- 60. Examples of AR projects
• Developing reading skills of Arabic students
• Formative assessment in a Web 2.0 environment
• Student attitudes to EAP grammar instruction
• Encouraging students to become independent
learners through self-assessment and reflection
• Using writing rubrics to develop learner autonomy
• Creating a blog for self-assessment
• Introducing learning portfolios
- 61. Summary of impact of Action
Research
• teaching/research skills &
knowledge
• professional development
• longer-term impact
• new reputational dimension
• rejuvenation of practice
• career options
• programme dissemination
• strengthened practice
• engagement & motivation
• enhanced PD & professionalism
• ‘ripple-effect’
Impact operates at individual, institutional and sectoral levels.
- 62. 9. Find out more: Assessment
resources
1. Self-access materials including webinars and videos
and materials produced by exam boards.
2. Courses run by: ALTE, EALTA, CET and so on.
3. Join a discussion list or online group for teachers or
researchers.
4. Talk to your colleagues about assessment.
5. Reflect on yourself as a learner and on your
learners' experiences.
6. Find out more about action research, consider doing
your own project on an assessment-related issue.
- 63. Cambridge English resources
• Webinars for teachers:
www.cambridgeenglish.org/webinars/
• Cambridge English TV:
www.youtube.com/user/cambridgeenglishtv
• Cambridge English Teacher:
www.cambridgeenglishteacher.org/
• Teacher support website:
www.teachers.cambridgeesol.org
• Principles of Good Practice:
www.cambridgeenglish.org/research-and-validation/quality-and-accountability/
- 64. Find out more: Action Research
Some key writers:
• Simon Borg, resources at: http://simon-borg.co.uk/free-sources-of-language-teaching-research
• Anne Burns, start with her Action Research video at:
http://professoranneburns.com/arvideo.htm
Online resources:
• Action Research in Education course -
www.edu.plymouth.ac.uk/resined/actionresearch/arhome.htm
• Research Notes 44, 48, 53 (reports of Australian projects):
www.cambridgeenglish.org/researchnotes/
• Recent webinar at www.CambridgeEnglishTeacher.org
- 65. Find out more: LOA
Online resources:
• Cambridge English approach, resources and FAQs:
www.cambridgeenglish.org/research-and-validation/fitness-for-purpose/
• Webinar and videos on Cambridge English TV (YouTube)
- 66. What have we achieved today?
• Explored some of the key concepts of assessment and
our own experiences of assessment, before thinking
about its impact on teaching/learning and society.
• Thought about how learners and teachers could benefit
from a better understanding and use of assessment,
through a Learning Oriented Assessment approach or
undertaking some Action Research.
• Looked at some of the resources available.
• Laid the groundwork to enable you to think of how
assessment literacy can be applied in your context.
- 67. Thank you for taking part today!
Any comments or questions?
For references & links, please contact me:
barker.f@cambridgeenglish.org