1. Changing the assessment landscape
@solentlearning
@tansyjtweets
Dr Tansy Jessop
Professor of Research Informed Teaching
#ISSOTL17, 12 October 2017
2. Outline
• Scoping the problem: your view and mine
• What is TESTA?
• Exploring why TESTA works as a change approach
• Linking it to principles of educational change
3. The big backwash
1) Assessment drives what students pay attention to, and defines the actual curriculum (Ramsden 2003).
2) Feedback is the single most important factor in learning (Hattie 2009; Black and Wiliam 1998).
4. Starting with you…
What is the main assessment & feedback challenge
you face?
Go to www.menti.com and use the code 54 65 24
Write down three words or phrases that spring to
mind
5. Lose-lose situation
It was heavy, tons of marking for the tutor. It was such hard work. It was criminal.
Media Course Leader
I’m really bad at reading feedback. I’ll look at the mark and then be like ‘well stuff it, I can’t do anything about it’.
Student, TESTA focus group
12. A student’s lecture to her professor
The best approach from the student’s perspective is to focus on concepts. I’m sorry to break it to you, but your students are not going to remember 90 per cent – possibly 99 per cent – of what you teach them unless it’s conceptual…. when broad, over-arching connections are made, education occurs. Most details are only a necessary means to that end.
http://www.timeshighereducation.co.uk/features/a-students-lecture-to-professors/2013238.fullarticle#.U3orx_f9xWc.twitter
13. The problem in a nutshell
• Modular curricula foster fragmentation
• Solutions applied to single modules
• Individual improvements ≠ holistic change
• Culture of blame and frustration
• We have a curriculum design problem
• We have a teamwork problem
• We have a systems problem…
17. TESTA….
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
18. It seems to work
[Chart: average NSS scores on questions Q5–Q9 and overall satisfaction, comparing 32 TESTA programmes in 13 universities with sector-wide NSS 2015 scores]
19. Why does it work?
• Addresses a burning question for academics
and students
• Uses rigorous research
• Builds on SOTL
• Engages whole teams in a collegiate process
• Provides practical strategies
• Engages with wider institutional systems
20. Burning questions (academics)
I was quite shocked when I discovered that
people just did things in a random way, but to me
it all makes sense. I was teaching in a vacuum.
I was struck by the sheer amount of assessment.
(Programme Leader Interviews)
21. Robust data is compelling
The value was to look at what we do from a scientific perspective and look at things objectively, and that is really enabling us to re-think how we do things.
I’ve found it useful to have a mirror held up, to give a real reflection. We talk about the ‘student voice’, but actually this has provided a mechanism.
It’s been challenging. It has shown us that there is no room for complacency. It also has shown us that we need to listen more to what students are saying.
22. It builds on SOTL
• ‘Time-on-task’ (Gibbs 2004)
• Challenging and high expectations (Chickering
and Gamson 1987; Arum and Roksa 2011)
• Internalising goals and standards (Sadler 1989;
Nicol and McFarlane-Dick 2006)
• Prompt, detailed, specific, developmental,
dialogic feedback practice and design (Gibbs
2004; Nicol 2010; Boud and Molloy 2013)
• Deep learning (Marton and Saljo 1976).
23. Collegial process
But I don’t think it’s just the tools. The tools are good
and they work really, really well, it’s also the
approach. It comes through a kind of collegiality.
(Programme Leader)
For the first time, I felt as though I was a player on
the pitch, rather than someone watching from the
side lines. We were discussing real issues
(Senior Lecturer).
24. Practical strategies
Everybody has brought in more formative. The idea was to consolidate the summative assessment and bring in more formative.
Do we want to continue offering twenty different types of assessment or do we bite the bullet and say “We want the students to be able to master five of them”?
There has been more of a spacing of assessments.
There is a lot more feed forward, which is what came out of the TESTA.
25. System-wide changes
• Quality assurance and enhancement working together
• 18-month design cycle
• TESTA as a required process
• Evidence-informed curriculum design
• Curriculum design module on the new lecturer course
27. References
Arum, R. and Roksa, J. (2011) Academically Adrift: Limited Learning on College Campuses. Chicago: University of Chicago Press.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of
design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712.
Gibbs, G. and Simpson, C. (2004) Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1(1), pp. 3–31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout:
High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.
Jessop, T. and Tomas, C. (2016) The implications of programme assessment on student learning. Assessment
and Evaluation in Higher Education. Published online 2 August 2016.
Jessop, T. (2016) Seven years and still no itch – why TESTA keeps going. Educational Developments, 17(3), pp. 5–8. SEDA.
Jessop, T. and Maleckar, B. (2014) The influence of disciplinary assessment patterns on student learning: a comparative study. Studies in Higher Education. Published online 27 August 2014.
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns. Assessment and Evaluation in Higher Education, 39(1), pp. 73–88.
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) TESTA: A way of thinking about assessment and feedback. Educational Developments, 14(3).
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in mass higher education. Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.
O’Donovan, B., Price, M. and Rust, C. (2008) Developing student understanding of assessment standards: a nested hierarchy of approaches. Teaching in Higher Education, 13(2), pp. 205–217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science,
18(2), pp. 119–144. doi: 10.1007/bf00117714.
Editor's Notes
We need to do something. This is something. Let’s do it! Yes Minister – very different approach
Feedback: all that effort, but what is the effect? Margaret Price
But lots of projects and programmes do….
Disconnected seeing the whole degree in silos – my module, lecturer perspective (Elephant, trunk, ears, tusks etc) compared to student perspective of the whole huge beast. I realise that what we were saying is two per module
Not so good for complex learning, integrating knowledge, lends itself to disposable curriculum fragmented learning. Amplified summative, less time for formative. Hard to make connections, difficult to see the joins between assessments, much more assessment, much more assessment to accredit each little box. Multiplier effect. Less challenge, less integration. Lots of little neo-liberal tasks. The Assessment Arms Race.
Language of ‘covering material’ Should we be surprised? Knowledge wastage
What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only technical, research, mapping, also shaping our thinking. Evidence, assessment principles. Habermas framework.
Root, branch, ecological changes – Hargreaves and Fullan