1. Evidence to Action:
Why TESTA works
Tansy Jessop
@tansyjtweets @TESTAwin
#SLTCC2016
24 June 2016
2. The plan
1. Brief overview of TESTA
2. Three change-related problems
3. Three findings and strategies
4. Do we need a paradigm shift?
3. Why assessment and feedback
matter...
1) Assessment drives what students pay attention
to, and defines the actual curriculum (Ramsden
1992).
2) Feedback is the single most important factor in
learning (Hattie 2009; Black and Wiliam 1998).
8. TESTA….
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
9. TESTA shifts in perspective from…
• from ‘my’ module to ‘our’ programme
• from teacher focus on module delivery to student experience of the whole programme
• from individualistic modular design to coherent team design
• from the NSS to enhancement strategies
10. TESTA addresses three problems
Problem 1: Something’s going awry, but I’m not
sure why
Problem 2: Curriculum design problems
Problem 3: Evidence to action problem
15. The best approach from the student’s perspective is to focus
on concepts. I’m sorry to break it to you, but your students are
not going to remember 90 per cent – possibly 99 per cent – of
what you teach them unless it’s conceptual…. when broad,
over-arching connections are made, education occurs. Most
details are only a necessary means to that end.
http://www.timeshighereducation.co.uk/features/a-students-lecture-to-professors/2013238.fullarticle#.U3orx_f9xWc.twitter
A student’s lecture to her professor
16. Problem 3: Evidence to action gap
http://www.liberalarts.wabash.edu/study-overview/
Flawed Assumptions…
• Main problem: a lack of high-quality data
• Logic will prevail
• Systems will change
17. Proving is different from improving
“It is incredibly difficult to translate assessment
evidence into improvements in student learning”
“It’s far less risky and complicated to analyze data
than it is to act”
(Blaich & Wise, 2011)
18. Paradigm | What it looks like
Technical-rational | Focus on data and tools
Relational | Focus on people
Emancipatory | Focus on systems and structures
19. TESTA themes and impacts
1. Variations in assessment patterns
2. High summative: low formative
3. Disconnected feedback
20. Defining the terms
• Summative assessment carries a grade which counts towards the degree classification.
• Formative assessment does not count towards the degree (whether pass/fail or graded), elicits comments, and is required of all students.
21. 1. Huge variations
• What is striking for
you about this data?
• How does it compare
with your context?
• Does variation
matter?
22. Assessment features across a 3-year UG degree (n=75)
Characteristic | Range
Summative assessments | 12–227
Formative assessments | 0–116
Varieties of assessment | 5–21
Proportion of examinations | 0%–87%
Time to return marks & feedback | 10–42 days
Volume of oral feedback | 37–1,800 minutes
Volume of written feedback | 936–22,000 words
23. Patterns over three-year UK degrees (n=75)
Characteristic | Low | Medium | High
Volume of summative assessment | Below 33 | 40–48 | More than 48
Volume of formative only | Below 1 | 5–19 | More than 19
% of tasks by examination | Below 11% | 22–31% | More than 31%
Variety of assessment methods | Below 8 | 11–15 | More than 15
Written feedback in words | Less than 3,800 | 6,000–7,600 | More than 7,600
24. Actions based on evidence
a) Reduction in summative
b) Increase in formative
c) Streamlined varieties
d) More, less or different feedback depending…
e) Used evidence to inform a team approach to
curriculum design
f) Every time a coconut with each feature
25. Theme 2: High summative: low formative
• Summative ‘pedagogies of control’
• Circa 2 per module in UK
• Formative to summative ratio of 1:8
• Formative weakly understood and practised
26. What students say…
• A lot of people don’t do wider reading. You just focus
on your essay question.
• In Weeks 9 to 12 there is hardly anyone in our
lectures. I'd rather use those two hours of lectures to
get the assignment done.
• It’s been non-stop assignments, and I’m now free of
assignments until the exams – I’ve had to rush every
piece of work I’ve done.
27. What students say about formative
• If there weren’t loads of other assessments, I’d do
it.
• If there are no actual consequences of not doing it,
most students are going to sit in the bar.
• It’s good to know you’re being graded because you
take it more seriously.
• The lecturers do formative assessment but we
don’t get any feedback on it.
29. Actions based on evidence
1. Rebalance summative and formative
2. It’s a programme decision
3. Formative in the public domain
4. Link formative and summative
5. Require formative completion before marking summative
6. Authentic assessment tasks work best…
30. Case Study 1
• Entire Business School (WBS)
• Reduction from average 2 x summative, zero
formative per module
• …to 1 x summative and 3 x formative
• All working to similar script
• Systematic shift, experimentation, less risky
together
31. Case Study 2: Blogging as formative
• Modular approach
• Students read, write, and think more
• Teach less, learn more
• Dialogic, creative, reflective
• Personalises learning
• Develops ‘self-authorship’
• Authentic digital task
33. Take five
• Choose a quote that
strikes you.
• What is the key issue?
• What strategies might
address this issue?
34. What students say…
The feedback is generally focused on the module.
It’s difficult because your assignments are so detached
from the next one you do for that subject. They don’t
relate to each other.
Because it’s at the end of the module, it doesn’t feed
into our future work.
I read it and think “Well, that’s fine but I’ve already
handed it in now and got the mark. It’s too late”.
35. Students say the feedback relationship is
broken…
Because they have to mark so many that our
essay becomes lost in the sea that they have to
mark.
It was like ‘Who’s Holly?’ It’s that relationship
where you’re just a student.
Here they say ‘Oh yes, I don’t know who you are.
Got too many to remember, don’t really care, I’ll
mark you on your assignment’.
36. Actions based on evidence
• Conversation: who starts the dialogue?
• Iterative cycles of reflection across modules
• Quick generic feedback: the ‘Sherlock’ factor
• Feedback synthesis tasks
• Reflecting on improvement in relation to past
performance
• Technology: audio, screencast and blogging
• From feedback as ‘telling’…
• … to feedback as asking questions
40. Impacts at Winchester
• Upward trajectory in NSS assessment and feedback scores on TESTA
programmes – ‘Top 4’ university
• TESTA ‘effect’ - people talk about formative
• Team approach to designing curricula
• Design cycle for periodic review includes TESTA
• Further research: JISC Fastech Project 2011/14
• Linked REACT Student engagement project 2015/17
41. References
Blaich, C. and Wise, K. (2011) From Gathering to Using Assessment Results: Lessons from the Wabash
National Study. Occasional Paper #8. Urbana, IL: National Institute for Learning Outcomes
Assessment.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of
design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712. doi:
10.1080/02602938.2012.691462.
Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and
Teaching in Higher Education, 1(1), pp. 3–31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its
fallout: High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher
Education, 40(4), pp. 528–541. doi: 10.1080/02602938.2014.931927.
Hughes, G. (2014) Ipsative Assessment. Basingstoke: Palgrave Macmillan.
Jessop, T. and Maleckar, B. (2014) ‘The influence of disciplinary assessment patterns on student
learning: A comparative study’, Studies in Higher Education. Published online 27 August 2014.
http://www.tandfonline.com/doi/abs/10.1080/03075079.2014.943170
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: A large-scale
study of students’ learning in response to different assessment patterns’, Assessment & Evaluation in
Higher Education, 39(1), pp. 73–88.
Nicol, D. (2010) ‘From monologue to dialogue: Improving written feedback processes in mass higher
education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.
O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment
standards: A nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205–217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional
Science, 18(2), pp. 119–144. doi: 10.1007/bf00117714.
Williams, J. and Kane, D. (2009) ‘Assessment and feedback: Institutional experiences of student
feedback, 1996 to 2007’, Higher Education Quarterly, 63(3), pp. 264–286.
Editor's Notes
How do you measure soft stuff? 5 day cricket match versus 20/20
What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only technical, research, mapping, also shaping our thinking. Evidence, assessment principles. Habermas framework.
Dominos fast food to Raymond Blanc slow learning
Data – persistent problem with A&F scores. Traffic light systems – green for good. DVC: find the people who are doing well so we can share best practice. Three programmes. Neil McCaw
Hard to make connections, difficult to see the joins between assessments; much more assessment to accredit each little box. Multiplier effect. Less challenge, less integration. Lots of little neo-liberal tasks. The Assessment Arms Race.
Language of ‘covering material’ Should we be surprised?
Wabash study – 2005-2011, 17,000 students in 49 American colleges. 60-70 publications Critical thinking, moral reasoning, leadership towards social justice, engagement in diversity, deep intellectual work.
TESTA has done the data and that’s been useful. Ideological compromises. Mixed methods approaches. Critical pedagogy sleeping with the enemy. Democratic, participatory, liberating curriculum and pedagogy. Teachers and students shape and change education. Resist managerialism and the market. Risky pedagogies.
Teach Less, learn more. Assess less, learn more.
Impoverished dialogue Nicol, Mass Higher Education; Relationship