11. TESTA improves students’ perceptions of A&F…
[Chart: comparison of 32 programmes in 13 universities with sector scores – average NSS scores (60–95%) on Q5–Q9 and OS, NSS 2015 scores vs TESTA scores]
12. …and improves the staff experience
• More engaging formative assessment
• Less measuring
• Students learning more
• Curriculum less stuffed
13. Activity One: mock audit
[Diagram: TESTA methods – Programme Audit, Assessment Experience Questionnaire (AEQ) and Student Focus Groups – feeding into the Programme Team Meeting]
14. The Audit: Caveats
1. The audit is not everything
2. It captures the official discourse
3. It records the planned curriculum
4. Some data are better, some weaker, some gaps
15. Summary of audit data
• Some context
• Number of summative assessments
• Number of formative-only tasks
• Varieties of assessment
• Proportion of exams
• Written feedback in words
• Speed of return of feedback
18. Make sense of audit data
1) What is striking about the data?
2) What surprises or puzzles you?
3) What student learning behaviours might the assessment patterns foster?
4) What do you want to know more about?
Each table looks at:
• 1 x overview data, OR
• 1 x discipline, OR
• 1 x university type
• Briefly discuss in relation to the questions above
19. Typical A&F patterns
73 programmes in 14 universities (Jessop and Tomas 2017)
Characteristic | Low | Medium | High
Volume of summative assessment | Below 33 | 40–48 | More than 48
Volume of formative-only assessment | Below 1 | 5–19 | More than 19
% of tasks by examination | Below 11% | 22–31% | More than 31%
Variety of assessment methods | Below 8 | 11–15 | More than 15
Written feedback in words | Less than 3,800 | 6,000–7,600 | More than 7,600
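As a rough illustration of how a programme's audit figures can be read against these benchmarks, the sketch below classifies each metric into the Low/Medium/High bands from the table. The function, dictionary keys and example numbers are my own illustrative choices, and values falling in the gaps between the published bands (e.g. 34–39 summative tasks) are simply treated as Medium.

```python
# Illustrative sketch only: thresholds come from the Jessop and Tomas (2017)
# table above; names and example figures are hypothetical.
BANDS = {
    "summative_tasks":    (33, 48),
    "formative_tasks":    (1, 19),
    "exam_percentage":    (11, 31),
    "assessment_variety": (8, 15),
    "feedback_words":     (3800, 7600),
}

def band(metric: str, value: float) -> str:
    """Place one audit figure into a Low / Medium / High band."""
    low, high = BANDS[metric]
    if value < low:
        return "Low"
    if value > high:
        return "High"
    return "Medium"

# e.g. a hypothetical programme with 45 summative tasks and 9% of tasks by exam
print(band("summative_tasks", 45))  # Medium
print(band("exam_percentage", 9))   # Low
```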
22. Lies, damned lies and statistics…
• The AEQ constructs
• Filling out the AEQ
• The scales
• Analysing AEQ & audit data
23. Why a questionnaire about assessment?
• Weaker NSS scores on assessment and feedback
• Weaknesses of the NSS itself?
• Quick and big data
• Quantitative/qualitative
24. AEQ 3.3 (2003)
• Designed to measure the ‘conditions under which assessment supports learning’
• Based on theory and evidence + selected CEQ scales
• Robust enough factor structure and scale coherence – measures what it’s meant to be measuring?
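The speaker notes refer to Cronbach's alpha as the test of scale coherence here. Below is a minimal sketch of that statistic, assuming an n-respondents x k-items matrix of Likert scores; the function name and the simulated data are illustrative only and are not part of the AEQ materials.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: internal consistency of k items forming one scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of respondents' scale totals
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Illustrative only: 200 simulated respondents answering a hypothetical 5-item Likert scale.
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))  # shared underlying construct
responses = np.clip(np.round(3 + trait + rng.normal(scale=0.8, size=(200, 5))), 1, 5)
print(round(cronbach_alpha(responses), 2))  # values above ~0.7 are usually read as coherent
```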
25. AEQ 4.0 (2018)
• Fill in the AEQ 4.0 from the vantage point of being a student in one of your classes
• Paper or online?
• https://educ.sphinxonline.net/v4/s/ha2fbs
26. Comparing Audit and AEQ data from one programme
In pairs or groups, explore programme audit and AEQ data from one programme. Does anything stack up? Are there loose ends, questions, contradictions?
28. Where to find focus group guidance
• How to run a focus group: resources > research toolkits – http://www.testa.ac.uk/index.php
• Role play: resources > focus group schedule – www.testa.ac.uk
29. Main pointers for focus groups
• Questions are broad themes
• Order questions from easy to complicated
• Sit in a circle
• It’s the discussion that matters
• Go with the flow
• But steer when off topic, direct, pass the ball
• Troubleshooting
• Ethics
31. Getting students to attend…
• Get the support of lecturers, programme team
• Explore using student researchers
• Use vouchers
• Food
• Between 3 and 8 students for one hour
• Ethics and confidentiality
32. What the data looks like: …and the intelligent transcript
• MP3 to text: https://transcribe.wreally.com/
34. Raw focus group data
• Read section of full transcript
• Highlight relevant text
• Develop a few codes
• Suggest a few themes
• Devise three headlines
• Identify quotations under each headline
35. Have a go at triangulating data
• Read through audit, AEQ and focus group data from one programme
• Quick abstract/bullet points of what seems to be going on
• Discuss with your group/flipchart
a) What are the stand out themes?
b) What jigsaw pieces fit together?
c) What unresolved issues remain?
36. The tone of the case study
• Build a narrative thread
• Descriptive, non-evaluative tone
• Empathetic
• Surprises, puzzles, contradictions
• Balancing weak and strong features
• Admitting gaps, interpretation, errors
• Not prescriptive, but give a steer & create options
40. What doesn’t work: lessons learned
• Too much information
• Too much negative information
• Lack of soft stuff – food, drinks, chat, humour, empathy, conducive spaces
• An inquorate team meeting
• Focusing on modules
41. What has worked and why
• Post-it predictions beforehand
• Trust and confidentiality
• Admitting gaps, listening
• Respect for disciplines
• Team ownership
• One-page notes – “you said”
• Focus on the whole programme
42. The TESTA effect
• Helps teams to talk about whole programme design
• Acting on evidence and principles
• Formative assessment
• Develops connections within/across modules
• Feedback as dialogue
• Greater knowledge and confidence among teachers about assessment for learning
• And…improved NSS scores
44. References
Barlow, A. and Jessop, T. (2016) ‘“You can’t write a load of rubbish”: Why blogging works as formative assessment’, Educational Developments, 17(3), pp. 12–15. SEDA.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712.
Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3–31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.
Jessop, T. and Tomas, C. (2017) ‘The implications of programme assessment on student learning’, Assessment & Evaluation in Higher Education.
Jessop, T. and Maleckar, B. (2016) ‘The influence of disciplinary assessment patterns on student learning: a comparative study’, Studies in Higher Education. Published online 27 August 2014.
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns’, Assessment & Evaluation in Higher Education, 39(1), pp. 73–88.
Nicol, D. (2010) ‘From monologue to dialogue: improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.
O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: a nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205–217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119–144.
Tomas, C. and Jessop, T. (2018) ‘Struggling and juggling: A comparison of student assessment loads across research and teaching-intensive universities’, Assessment & Evaluation in Higher Education. Published online 18 April.
Wu, Q. and Jessop, T. (2018) ‘Formative assessment: missing in action in both research-intensive and teaching-focused universities’, Assessment & Evaluation in Higher Education. Published online 15 January.
Editor's Notes
Tansy
Disconnected: seeing the whole degree in silos – ‘my module’, the lecturer perspective (elephant: trunk, ears, tusks, etc.) compared to the student perspective of the whole huge beast. I realise that what we were saying is two per module.
Not so good for complex learning or integrating knowledge; lends itself to a disposable curriculum and fragmented learning. Amplified summative, less time for formative. Hard to make connections, difficult to see the joins between assessments; much more assessment to accredit each little box. Multiplier effect. Less challenge, less integration. Lots of little neo-liberal tasks. The Assessment Arms Race.
Language of ‘covering material’. Should we be surprised?
The TESTA report back of programme findings was by far the most significant meeting I have attended in ten years of sitting through many meetings at this university. For the first time, I felt as though I was a player on the pitch, rather than someone watching from the side-lines. We were discussing real issues.
(Senior Lecturer, Education)
Tansy
In the UK, assessment and feedback are primary areas of disquiet in the NSS. NSS scores provide very little diagnostic information to help course teams adopt more effective assessment strategies. Every year, routine red/green/orange charts – a visual representation, accompanied by the ritual humiliation of programmes, but with no sense of why or how to change! The AEQ has been used in many countries.
Looked at in the overview – student effort; intellectual challenge; focused on understanding rather than memorising or ‘sufficing’; clear about goals and standards; feedback is effective – students read it, understand it, and use it to improve what they do next.
Cronbach’s Alpha – sounds like a disease to me, but it is a test to measure the internal consistency of items: do all the items measure the same construct?
Tansy
Codes look at small units of meaning – a student says ‘it takes four weeks to get it back, so you’ve already handed in the next one’, or ‘the tasks are different one after the other’, or ‘I never bother at the end of the course because it’s over’. All of these are partly to do with timing, and they contribute to the theme of students not using their feedback. Another reason students don’t use feedback is that they don’t trust it, so they say things like ‘if X marks it you’ll get a good grade, if Y a bad one’, or ‘if you get her on a good day’, or ‘it’s so subjective’, etc.
Please stay on the same data set as in the AEQ session. We will work it into a case profile.
Use flip chart
What does it feel like to be a student? What does it feel like to be a lecturer at the end of this? Empathetic, balanced reporting.
Tansy
Big guns, multiple agendas, using TESTA to get systemic changes done. Before we did headlines: lots of themes and literature – there is literature to back it up, but for the team briefing we keep it implicit, pulling theory out informally and illustratively in conversation.