Why a programme view?
Why TESTA?
Professor Tansy Jessop
TESTA Workshop
Trinity College Dublin
9 February 2017
Jottings
1. One thing you already know about TESTA
2. One problem you have faced with assessment
3. One problem you have faced with feedback
4. One blue sky idea to address a problem
What I am hoping to achieve today
1. Brief overview of TESTA
2. Why people find it useful
3. Three problems TESTA addresses
4. Four themes in the data
5. Solutions: a taster
Mixed Methods approach
[Diagram: TESTA draws on three methods – a Programme Audit, the Assessment Experience Questionnaire (AEQ) and Student Focus Groups – which feed into a Programme Team Meeting]
Sustained growth
TESTA…
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
It enables you to see the whole elephant
Three problems
Problem 1: Something awry, not sure why
Problem 2: Curriculum design problem
Problem 3: The problem of educational change
1. Something awry, not sure why
Wow! Our students love History! Fantastic!
Whoops, there’s a little problem here!
Fix it!
OK, we’ll look especially at polishing up our feedback. Students seem to find that the least best thing.
Apply spit and polish
Anyone for the feedback sandwich?
I cushion the blow!
The hard truths are nicely disguised!
Me too – nice and soft!
Problem 2: Curriculum design problem
Does IKEA 101 work for complex learning?
Curriculum privileges ‘knowing’ stuff
“Content is often the most visible aspect
for students, the control of which is
frequently devolved to individual
academics, who receive little or no training
in curriculum design and planning”
(Blackmore and Kandiko 2014, 7).
Blunt instrument curriculum
Problem 3: Educational change problem
Three misguided assumptions:
1. There is not enough high-quality data.
2. Data will do it.
3. Academics will buy it.
http://www.liberalarts.wabash.edu/study-overview/
Proving is different from improving
“It is incredibly difficult to translate assessment
evidence into improvements in student learning”
“It’s far less risky and complicated to analyze data
than it is to act”
(Blaich & Wise, 2011)
Paradigm            What it looks like
Technical rational  Focus on data and tools
Relational          Focus on people
Emancipatory        Focus on systems and structures
TESTA themes and impacts
1. Variations in assessment patterns
2. High summative: low formative
3. Disconnected feedback
4. Lack of clarity about goals and standards
Defining the terms
• Summative assessment carries a grade which counts towards the degree classification.
• Formative assessment does not count towards the degree (it is ungraded or pass/fail), elicits feedback comments, and is required of all students.
1. Huge variations
• What is striking for
you about this data?
• How does it compare
with your context?
• Does variation
matter?
Assessment features across a 3-year UG degree (n=73)
Characteristic                      Range
Summative assessments               12-227
Formative assessments               0-116
Varieties of assessment             5-21
Proportion of examinations          0%-87%
Time to return marks & feedback     10-42 days
Volume of oral feedback             37-1,800 minutes
Volume of written feedback          936-22,000 words
Theme 2: High summative: low formative
• Summative ‘pedagogies of control’
• Circa two summative assessments per module in the UK
• A 1:8 ratio of formative to summative
• Formative weakly understood and practised
Assessment Arms Race
What students say about high summative
• A lot of people don’t do wider reading. You just focus
on your essay question.
• In Weeks 9 to 12 there is hardly anyone in our
lectures. I'd rather use those two hours of lectures to
get the assignment done.
• It’s been non-stop assignments, and I’m now free of
assignments until the exams – I’ve had to rush every
piece of work I’ve done.
What students say about formative
• If there are no actual consequences of not
doing it, most students are going to sit in the
bar.
• The lecturers do formative assessment but we
don’t get any feedback on it.
Actions based on evidence
1. Rebalance summative and formative
2. Programme approach
3. Formative in the public domain
4. Linking formative and summative
5. Risky, creative, challenging tasks
6. Students reading and producing more
7. Deeper understanding of value of formative
Theme 3: Disconnected feedback
Take five
• Choose a quote that
strikes you.
• What is the key issue?
• What strategies might
address this issue?
What students say…
It’s difficult because your assignments are so detached
from the next one you do for that subject. They don’t
relate to each other.
Because it’s at the end of the module, it doesn’t feed into
our future work.
Because they have to mark so many that our essay
becomes lost in the sea that they have to mark.
It was like ‘Who’s Holly?’ It’s that relationship where
you’re just a student.
Actions based on evidence
• Conversation: who starts the dialogue?
• Iterative cycles of reflection across modules
• Quick generic feedback: the ‘Sherlock’ factor
• Feedback synthesis tasks
• Technology: audio, screencast and blogging
• From feedback as ‘telling’…
• … to feedback as asking questions
Theme 4: Confusion about goals and standards
• Consistently low scores on the AEQ for clear
goals and standards
• Alienation from the tools, especially criteria
and guidelines
• Symptoms: perceptions of marker variation,
unfair standards and inconsistencies in practice
What students say…
We’ve got two tutors – one marks completely differently to
the other and it’s pot luck which one you get.
They have different criteria, they build up their own
criteria.
It’s such a guessing game.... You don’t know what they
expect from you.
They read the essay and then they get a general
impression, then they pluck a mark from the air.
Taking action: internalising goals and standards
Lecturers
• Regular calibration exercises
• Discussion and dialogue
• Discipline-specific criteria (no cut and paste)
Lecturers and students
• Rewrite/co-create criteria
• Marking exercises
• Design and value formative
Students
• Enter the secret garden: peer review
• Engage in drafting processes
• Self-reflection
From this educational paradigm…
Transmission Model
…to this:
Social Constructivist Model
References
Blaich, C. and Wise, K. (2011) From Gathering to Using Assessment Results: Lessons from the Wabash National Study. Occasional Paper #8. Urbana, IL: University of Illinois, National Institute for Learning Outcomes Assessment.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: the challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698-712. doi: 10.1080/02602938.2012.691462.
Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3-31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: high-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.
Jessop, T. and Tomas, C. (2016) ‘The implications of programme assessment on student learning’, Assessment & Evaluation in Higher Education. Published online 2 August 2016.
Jessop, T. and Maleckar, B. (2014) ‘The influence of disciplinary assessment patterns on student learning: a comparative study’, Studies in Higher Education. Published online 27 August 2014. http://www.tandfonline.com/doi/abs/10.1080/03075079.2014.943170
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns’, Assessment & Evaluation in Higher Education, 39(1), pp. 73-88.
Nicol, D. (2010) ‘From monologue to dialogue: improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501-517.
O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: a nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205-217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119-144. doi: 10.1007/bf00117714.


Editor's Notes

  • #7 How do you measure the soft stuff? A five-day cricket match versus Twenty20.
  • #8 What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only technical, research, mapping, also shaping our thinking. Evidence, assessment principles. Habermas framework.
  • #9 I realised what we were saying was ‘That’s only two per module’. And I was like ‘Ah, but that’s the point. This is a programmatic thing and you’re used to thinking about a module’ (Programme Leader, American Studies). Data – persistent problem with A&F scores. Traffic light systems – green for good. DVC: find the people who are doing well so we can share best practice. Three programmes. Neil McCaw.
  • #11 Data – persistent problem with A&F scores. Traffic light systems – green for good. DVC: find the people who are doing well so we can share best practice. Three programmes. Neil McCaw.
  • #16 Honest dialogue vs tricks of dialogue to minimise damage
  • #18 Hard to make connections, difficult to see the joins between assessments; much more assessment to accredit each little box. Multiplier effect. Less challenge, less integration. Lots of little neo-liberal tasks. The Assessment Arms Race.
  • #20 The language of ‘covering material’. Should we be surprised?
  • #21 Wabash study – 2005-2011, 17,000 students in 49 American colleges. 60-70 publications. Critical thinking, moral reasoning, leadership towards social justice, engagement in diversity, deep intellectual work.
  • #23 TESTA has done the data and that’s been useful. Ideological compromises. Mixed methods approaches. Critical pedagogy sleeping with the enemy. Democratic, participatory, liberating curriculum and pedagogy. Teachers and students shape and change education. Resist managerialism and the market. Risky pedagogies.
  • #29 Teach less, learn more. Assess less, learn more.
  • #39 Students can increase their understanding of the language of assessment through their active engagement in: ‘observation, imitation, dialogue and practice’ (Rust, Price, and O’Donovan 2003, 152), Dialogue, clever strategies, social practice, relationship building, relinquishing power.