Zoom to wide-angle lens: whole
programme view of assessment
and feedback
Dr Tansy Jessop, TESTA, University of Winchester
SIAST, Universities of Regina and Saskatchewan,
Briercrest College & Seminary, RCMP College
6 November 2013
1) Introductions and questions
2) TESTA Overview
3) Research Methodology
4) Two case studies - two paradigms?
Q&A
5) Variations
6) Themes across the data
7) Change approaches
Q&A
Webinar Overview
5-10 minutes at the start for introductions
 Your name and role
 Your discipline/career background
 The best assessment or feedback ‘tactic’ you have
encountered and why you think it worked
 Your ‘wicked question’ or assessment problem.
Introductions
 Why assessment and feedback?
 Why do UK students rank A&F lower on
the National Student Survey?
 Why take a ‘programme-level’ view?
 What are systems unintentionally doing to
student learning?
The big questions
University of Winchester
Arts
Humanities
Education
Social Sciences
Business & Law
3,000 full-time students
5,000 altogether
150 PhD students
Mature students
 Higher Education Academy funded research (2009-12)
 Seven programmes in 4 partner universities
 Mapping programme-wide assessment
 Student voice and student perception data
 Evidence and principles to act on
About TESTA
Transforming the Experience of Students through Assessment
TESTA ‘Cathedrals Group’ Universities
Edinburgh
Edinburgh Napier
Greenwich
Canterbury Christchurch
Glasgow
Lady Irwin College University of Delhi
 Website hits: map of the world
TESTA
“…is a way of thinking
about assessment and
feedback”
Graham Gibbs
 Captures and distributes sufficient student time and
effort - time on task
 Challenging learning with clear goals and standards,
encouraging deep learning
 Sufficient, high quality feedback, received on time, with a
focus on learning
 Students pay attention to the feedback and it guides
future studies – feeding-forward
 Students are able to judge their own performance
accurately, self-regulating
Based on assessment principles
http://www2.glos.ac.uk/offload/tli/lets/lathe/issue1/articles/simpson.pdf
TESTA Research Methods
(Drawing on Gibbs and Dunbar-Goddet, 2008, 2009)
[Diagram: Assessment Experience Questionnaire, Focus Groups and Programme Audit feed into a Programme Team Meeting]
 Number of assessment tasks
 Summative/formative
 Variety
 Proportion of exams
 Oral feedback
 Written feedback
 Speed of return of feedback
 Specificity of criteria, aims and learning outcomes.
Audit in a nutshell
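To make the audit bookkeeping concrete, here is a minimal sketch of how measures like these could be tallied for a single programme. It is an illustration only: the Task record, its field names and the example figures are hypothetical, not the TESTA audit instrument itself.

```python
# Hypothetical illustration only: tallying audit-style measures for one
# programme's assessment tasks (not the TESTA audit instrument).
from dataclasses import dataclass

@dataclass
class Task:
    kind: str            # e.g. "essay", "exam", "presentation"
    summative: bool      # does it count towards the grade?
    feedback_words: int  # written feedback returned to the student
    return_days: int     # days until feedback is returned

def audit_summary(tasks: list[Task]) -> dict:
    summative = [t for t in tasks if t.summative]
    formative = [t for t in tasks if not t.summative]
    exams = [t for t in summative if t.kind == "exam"]
    return {
        "summative_tasks": len(summative),
        "formative_tasks": len(formative),
        "variety": len({t.kind for t in tasks}),
        "proportion_exams": len(exams) / max(len(summative), 1),
        "written_feedback_words": sum(t.feedback_words for t in tasks),
        "mean_return_days": sum(t.return_days for t in tasks) / max(len(tasks), 1),
    }

# Example: two summative tasks (one an exam) and one formative task
tasks = [
    Task("essay", True, 450, 21),
    Task("exam", True, 0, 28),
    Task("presentation", False, 120, 7),
]
print(audit_summary(tasks))
```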
 Quantity of Effort
 Coverage of content and knowledge
 Clear goals and standards
 Quantity and Quality of Feedback
 Use of feedback
 Appropriate assessment
 Learning from exams
 Deep and surface learning
Assessment Experience
Questionnaire
Focus Groups
 Student voice and narrative
 Explanation
 Corroboration & contradiction
 Compelling evidence with the stats
 tells a good story
 raises a thought-provoking issue
 has elements of conflict
 promotes empathy with the central characters
 lacks an obvious, clear-cut answer
 takes a position, demands a decision &
 is relatively concise (Gross-Davis 1993)
Case Study…
Case Study X: what’s going on?
 Committed and innovative lecturers
 Lots of coursework, of very varied forms
 No exams
 Masses of written feedback on assignments (15,000 words)
 Learning outcomes and criteria clearly specified
….looks like a ‘model’ assessment environment
But students:
 Don’t put in a lot of effort, and distribute it across only a few topics
 Don’t think there is a lot of feedback or that it is very useful, and don’t make use of it
 Don’t think it is at all clear what the goals and standards are
 …are unhappy
A. Variety of assessment types confuses students
B. Assessment is ‘bunched’ at certain times
C. Too much feedback, too late
D. Teachers mark differently; students are uncertain about the
standards
E. A diet of all coursework and no exams leads to lack of
integration and synthesis
• Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
• Type any other comments/ideas/thoughts into the text-chat
What’s the main problem?
Case Study Y: what’s going on?
 35 summative assessments
 No formative assessment specified in documents
 Learning outcomes and criteria wordy and woolly
 Marking by global, tacit, professional judgements
 Teaching staff mainly part-time and hourly paid
….looks like a problematic assessment environment
But students:
 Put in a lot of effort and distribute their effort across topics
 Have a very clear idea of goals and standards
 Are self-regulating and have a good idea of how to close the gap
A. Students are sharing and reviewing work in class
B. Peer review is helping students to understand goals and
standards
C. Inadequate documentation is driving more dialogue
D. Students are more self-reliant than teacher-reliant
E. Authentic assessment linked to real-life processes and
products
• Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
• Type any other comments/ideas/thoughts into the text-chat
What’s working?
Transmission model | Social constructivist model
Expert to novice | Participatory, democratic
Planned & ‘delivered’ | Messy and process-oriented
Feedback by experts | Peer review
Feedback to novices | Self-evaluation
Privatised | Social process
Monologue | Dialogue
Emphasis on measuring | Emphasis on learning
Competition | Collaboration
Metaphor - machine | Metaphor - the journey
Two paradigms
Q & A
 Between 12 and 68 summative tasks
 Between 0 and 55 formative tasks
 From 7 to 17 different types of assessment
 Feedback returned within 10 to 35 days
 From 936 to 15,412 written words of feedback
 From 37 minutes to 30 hours of oral feedback
 From 0% to 79% of assessment by exams
Variations across 23 UG programmes in 8 UK universities
A. Influence of different disciplinary practices
B. Individual and modular autonomy
C. Lack of whole programme design
D. Absence of comparable standards across universities
E. Assessment lower down the pecking order than content,
knowledge, skills etc.
• Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
• Type any other comments/thoughts/ideas into the text-chat
Why so many variations?
My hunch: assessment is an
afterthought in curriculum design
How instructors view L & T
What course content?
What outcomes?
What methods?
What assessment?
How students view L&T
How will I be assessed?
What do I need to know?
What are the objectives?
What approaches to study
should I take?
http://www.cshe.unimelb.edu.au/assessinglearning/
Theme 1: Lack of formative assessment
Theme 2: Systems failure on time-on-task
Theme 3: Feedback problems
Theme 4: Student bewilderment about goals &
standards
Four themes
“Formative assessment is concerned with how
judgements about the quality of student responses can
be used to shape and improve students’ competence by
short-circuiting the randomness and inefficiency of trial-and-error
learning” (Sadler, 1989, p.120).
TESTA: unmarked, required, eliciting feedback
Theme 1: Lack of formative assessment
Rethinking formative and summative
Tasting soup (Stake, 1991): formative vs summative
 It was really useful. We were assessed on it but we weren’t officially
given a grade, but they did give us feedback on how we did.
 It didn’t actually count so that helped quite a lot because it was just
a practice and didn’t really matter what we did and we could learn
from mistakes so that was quite useful.
 Getting feedback from other students in my class helps. I can relate
to what they’re saying and take it on board. I’d just shut down if I
was getting constant feedback from my lecturer.
 I find more helpful the feedback you get in informal ways week by
week, but there are some people who just hammer on about what
will get them a better mark.
The potential
 If there weren’t loads of other assessments, I’d do it.
 If there are no actual consequences of not doing it, most students
are going to sit in the bar.
 It’s good to know you’re being graded because you take it more
seriously.
 I would probably work for tasks, but for a lot of people, if it’s not
going to count towards your degree, why bother?
 The lecturers do formative assessment but we don’t get any
feedback on it.
The barriers…
A. 116
B. 50
C. 25
D. 12
E. 6
• Select your response from the buttons (A B C D E) at the bottom-right of the list of participants
• Type any other comments/thoughts/ideas into the text-chat
How many x do you need to measure student achievement in a 3-year degree?
We could do with more assessments over the course of the year
to make sure that people are actually doing stuff.
We get too much of this end or half way through the term essay
type things. Continual assessments would be so much better.
So you could have a great time doing nothing until like a month
before Christmas and you’d suddenly panic. I prefer steady
deadlines, there’s a gradual move forward, rather than bam!
Theme 2: Time-on-task
Effort Map: Holland vs Alps (Graham)
[Chart: student effort (low, modest, maximum) over the term - a flat ‘Holland’ pattern vs ‘Alps’ peaks at the Week 6 and Week 12 deadlines]
 It was about nine weeks… I’d forgotten what I’d written.
 I read it and think “Well that’s fine, but I’ve already handed it
in now and got the mark. It’s too late”.
 Once the deadline comes up to just look on the Internet and
say ‘Right, that’s my mark. I don’t need to know too much
about why I got it’.
 You know that twenty other people have got the same sort of
comment.
Theme 3: Feedback - common problems
 The feedback is generally focused on the module.
 It’s difficult because your assignments are so detached
from the next one you do for that subject. They don’t
relate to each other.
 Because it’s at the end of the module, it doesn’t feed
into our future work.
 You’ll get really detailed, really commenting feedback
from one tutor and the next tutor will just say ‘Well
done’.
Programme-related feedback issues
 1220 AEQ returns, 23 programmes, 8 universities
 Statistical relationship between the quantity and
quality of feedback and students’ understanding of
goals and standards
 r=0.696, p<0.01
Feedback really matters…
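As a rough illustration of how a correlation like this is computed, here is a minimal sketch using scipy’s Pearson correlation. The per-programme scale scores below are invented for the example; they are not the TESTA AEQ data.

```python
# Hypothetical illustration: correlating a feedback scale with a
# goals-and-standards scale across programmes (invented numbers,
# not the TESTA AEQ dataset).
from scipy import stats

feedback_scale =        [2.8, 3.1, 3.4, 3.9, 2.5, 3.6, 4.0, 3.2]  # quantity/quality of feedback
goals_standards_scale = [2.9, 3.0, 3.5, 4.1, 2.4, 3.4, 4.2, 3.1]  # clarity of goals and standards

r, p = stats.pearsonr(feedback_scale, goals_standards_scale)
print(f"r = {r:.3f}, p = {p:.4f}")  # a strong positive r echoes the pattern reported above
```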
 It is a formal document so the language is quite complex and I’ve
had to read it a good few times to kind of understand what they
are saying.
 Assessment criteria can make you take a really narrow approach.
 It’s such a guessing game.... You don’t know what they expect
from you.
 I don’t have any idea of why it got that mark.
 They read the essay and then they get a general impression, then
they pluck a mark from the air.
 It’s a shot in the dark.
Theme 4: Student bewilderment
on goals and standards
1. Less summative, more formative
2. Multi-stage formative-summative cycles
3. Feedback: dialogue, peer-to-peer; feedback before marks
4. Longer modules, linking and sequencing across modules
5. Attention to timing of tasks, bunching and spreading
6. Quicker return times
7. Streamlining variety of assessment
8. Challenging students to do more, at a higher level
9. Structural changes; integrated synoptic assessments
Changes to assessment patterns
Q&A
Team approach
INDIVIDUALS OR TEAMS?
 Programme evidence brings the team together
 Addresses variations of standards
 The module vs greater good of the programme
 Helps to confront protectionism and silos
 Develops collegiality and conversations about
pedagogy
TESTA is about the team
TESTA is about coherence
www.testa.ac.uk
Gibbs, G. & Simpson, C. (2004) Conditions under which assessment supports students’
learning. Learning and Teaching in Higher Education, 1(1), 3-31.
Gibbs, G. & Dunbar-Goddet, H. (2009) Characterising programme-level assessment
environments that support learning. Assessment & Evaluation in Higher Education, 34(4),
481-489.
Hattie, J. & Timperley, H. (2007) The power of feedback. Review of Educational Research, 77(1), 81-112.
Jessop, T., El Hakim, Y. & Gibbs, G. (2013) The whole is greater than the sum of its
parts: a large-scale study of students’ learning in response to different assessment
patterns. Assessment & Evaluation in Higher Education, iFirst.
Jessop, T., McNab, N. & Gubby, L. (2012) Mind the gap: An analysis of how quality
assurance processes influence programme assessment patterns. Active Learning in
Higher Education, 13(3), 143-154.
Jessop, T., El Hakim, Y. & Gibbs, G. (2011) Research inspiring change. Educational
Developments, 12(4).
Nicol, D. (2010) From monologue to dialogue: improving written feedback processes in
mass higher education. Assessment & Evaluation in Higher Education, 35(5), 501-517.
Sadler, D.R. (1989) Formative assessment and the design of instructional systems.
Instructional Science, 18, 119-144.
References
Editor's Notes
1. Emphasis on measurement; two roles for teachers – judges of performance and achievement; collaborators with students in learning; obvious that we are interested in student learning, but our assessment systems are often perceived more as ways of measuring students than developing their capabilities; assessment and feedback are key drivers for learning. Paul Ramsden: assessment always drives the curriculum; where students pay attention. Feedback is the single most important factor in student learning – John Hattie; Black and Wiliam. NSS scores remain the lowest – 85% of all students in the UK are satisfied with their degree courses; only 70% are satisfied with A&F. Programme-level view – this is the most interesting. The rise of modularity and semesterisation; measuring students to death; whither coherence? Rising tide in the UK of thinking about how to constrain choice and get coherence back; thinking and planning in silos. Introduce you to TESTA.
2. What started as a research methodology has become a way of thinking. David Nicol – changing the discourse, the way we think about assessment and feedback; not only technical, research, mapping, but also shaping our thinking. Why is that?
3. Based on robust research methods about whole programmes - 40 audits; 2000 AEQ returns; 50 focus groups.
4. Hard data from chat and documents.
5. More than 50.
6. "How do we create texts that are vital? That are attended to? That make a difference?"
7. Large programme; modular approaches; marker variation; late feedback; dependency on tutors.
8. Student workloads often concentrated around two summative points per module. Sequencing, timing and bunching issues, and ticking off modules, so students don't pay attention to feedback at the end point.
9. Quote 1: late feedback is when it's too late to be of use. It needs to get back to them when it still matters. Quotes 2 and 3: modular silos impede the transfer and use of feedback, and students are looking for more relationship between tasks within and across modules. Quote 4: marker variation is rife, and creates wariness/distrust about using feedback. Quote 5: students exposed to lots of carefully scaffolded peer feedback find it invaluable.
10. Golden thread from feedback to clear goals and standards to overall satisfaction.
11. Limitations of explicit criteria; marker variation is huge, particularly in humanities, arts and professional courses (non-science ones). Students haven't internalised standards, which are often tacit. Marking workshops, exemplars, peer review.