Interactive TESTA masterclass
Part 2
@solentlearning
@tansyjtweets
Tansy Jessop
TESTA Project Leader
Durham University
3 May 2018
Theme 2: Disconnected feedback
The feedback is
generally focused
on the module
Because it’s at the end
of the module, it doesn’t
feed into our future
work.
It’s difficult because your
assignments are so detached
from the next one you do for
that subject. They don’t
relate to each other.
I read it and think “Well,
that’s fine but I’ve already
handed it in now and got the
mark. It’s too late”.
STRUCTURAL
It was like ‘Who’s
Holly?’ It’s that
relationship where
you’re just a student.
Because they have to mark so
many that our essay becomes
lost in the sea that they have
to mark.
Here they say ‘Oh yes, I don’t
know who you are. Got too
many to remember, don’t
really care, I’ll mark you on
your assignment’.
RELATIONAL
A feedback dialogue
Irretrievable breakdown…
Your essay lacked structure and
your referencing is problematic
Your classes are boring and I
don’t really like you 
Ways to be dialogic
• Conversation: who starts the dialogue?
• Cycles of reflection across modules
• Quick generic feedback
• Feedback synthesis tasks
• Peer feedback (especially on formative)
• Technology: audio, screencast and blogging
• From feedback as ‘telling’…
• … to feedback as asking questions
Students’ feedback to us
Students to lecturers:
Critical Incident Questionnaire
Stephen Brookfield’s Critical Incident Questionnaire http://bit.ly/1loUzq0
Pause, review, discuss
(in discipline area groups?)
• What one feedback problem raised in TESTA
data resonates for you?
• What ideas or practices presented would work
in your context (or are already working)? Any
new and different ideas?
• How can you implement these ideas to engage
students meaningfully in using feedback?
Feedback manifesto
Groups work together to produce a set of feedback
principles that they commit to embedding in their
programmes. Flipchart paper on walls in plenary.
Theme 3: Confusion about goals and
standards
• Consistently low scores on the AEQ for clear
goals and standards
• Alienation from the tools, especially criteria
and guidelines
• Symptoms: perceptions of marker variation,
unfair standards and inconsistencies in practice
We’ve got two
tutors – one marks
completely differently
to the other and it’s
pot luck which one
you get.
They read the essay and then
they get a general impression,
then they pluck a mark from
the air.
It’s like Russian
roulette – you may
shoot yourself and
then get an A1.
They have different
criteria, they build up their
own criteria.
There are criteria, but I find them really
strange. There’s “writing coherently,
making sure the argument that you
present is backed up with evidence”.
[Diagram: a continuum of approaches to criteria – from implicit criteria, to explicit written criteria, to markers justifying their judgements (“I justify”), to co-creation and participation, to active engagement by students.]
Taking action: internalising goals and
standards
• Regular calibration exercises
• Discussion and dialogue
• Discipline-specific criteria (not cut and paste)
Lecturers
• Rewrite/co-create criteria
• Marking exercises
• Discussing range of exemplars
Lecturers
and students
• Enter secret garden - peer review
• Engage in drafting processes
• Self-reflection
Students
How not to do it: reverse engineering
How not to get an idea of
standards: Students
Half the participants work in
groups of five using flipchart
paper to design a system to
ensure that students never
come to know what ‘good’
looks like.
Use flipchart paper, text and
drawings.
How not to mark to a
common standard: Academics
The other half design a system
which ensures that academics
are prevented from marking to
the same standard.
Use flipchart paper, text and
drawings.
From this educational paradigm…
…the transmission model…
…to a social constructivist model
References
Barlow, A. and Jessop, T. (2016) ‘“You can’t write a load of rubbish”: Why blogging works as formative assessment’, Educational Developments, 17(3), pp. 12–15. SEDA.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712.
Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3–31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.
Jessop, T. and Tomas, C. (2017) ‘The implications of programme assessment on student learning’, Assessment & Evaluation in Higher Education.
Jessop, T. and Maleckar, B. (2016) ‘The influence of disciplinary assessment patterns on student learning: a comparative study’, Studies in Higher Education. Published online 27 August 2014.
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns’, Assessment & Evaluation in Higher Education, 39(1), pp. 73–88.
Nicol, D. (2010) ‘From monologue to dialogue: improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501–517.
O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: a nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205–217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119–144.
Wu, Q. and Jessop, T. (2018) ‘Formative assessment: missing in action in both research-intensive and teaching-focused universities’, Assessment & Evaluation in Higher Education. Published online 15 January.


Editor's Notes

  • #2 Tansy
  • #10 Is anyone listening?
  • #17 Students can increase their understanding of the language of assessment through their active engagement in ‘observation, imitation, dialogue and practice’ (Rust, Price, and O’Donovan 2003, 152): dialogue, clever strategies, social practice, relationship building, relinquishing power.